[Binary data omitted: POSIX tar (ustar) archive containing gzip-compressed data; the compressed payload is not recoverable as text. Recoverable archive listing:

var/home/core/zuul-output/                      (directory, owner core:core)
var/home/core/zuul-output/logs/                 (directory, owner core:core)
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log)]
R/Şt9fc{Ìfdޡٰ6[iILY=O c]᱁@V&jo K*;6~7RMTQv\m6dz.n;B%c&:]rmM͆gQ7s<Ut7ޖ'-Fl؟1^ < KťD)u2h"2Gb(Vcxd"`+50KxEC(,-¦UWN )fujP{%#<#LNZ<#a:g\SA)R9~ߞ=bW!W)fң=P62+DjnKCeLKH90 Rh|Q еAV!8QJlB O1jv)UQR0%ykY<#z.yL.Z!O)5'~=ݧGۻ rGs we $s:QSAec3I08ExQkiAn9$0\df+hf^h.7)TD2'7,*͸Jb,A"A$ࣘmdrmrn`=C .1vyAYF3FXtHS &1Q)@_ԱAؿ4"HSÙ0h焌T$+w[G^Hҿ $%VQRvԮ ^sswwtnȬX9q_#qk16.A$FH]{0 ./l;GYkQe&fV4kU͓~-]uT>xq9}?lݘDXH1̅A}EY]reLx7' g5$1k#]4 CqeVy "pT'`b&tџnZҞJQE6ڸVbtj Is`&.u}YT$)٨y+#cSRL@zK%bdF/cL%նflm:5c(N?;yU:Ef,bBOgF nՅcfe&<$W̺i-kdn^[boBϷ`7x*(s1J>oERӵ֕]E`Lg3~~(q&,a1-R8Wq%b[YQ$ ݌{fKf:Lߌkгd;֛˱1jd4',!~fUy>3Ĉ܋*A2}ާAhn{v%=TSlO]m)?#˿iC8$@2;g1_8H-I9VU$EIJ,܁-Gbիw;=̼V+8+6+D,=KlPT>Rl/wKʋ2b a^ oxV` */9\ao?y\Hs R׀ijFގ̙3 Xj [2ԛQșd>t˴fLRkJ' c-77[8Ԁ:*R$1 vҠUo+uQ;/'܆MTh@v2n XxMo Zqb>?+~29_>R_Y3a*+owJ!<>E7>͑w'#l䥂y]Xz~{`hd4JhV5 OݐSx}"ş?~sQv+y~o ߛsFp28o-a T1/sD7߾&Dt{I\ P|d`J*IgH uGNDg#L61L9 r `x |A{ G O4Zd:BTF,\8;iiIɆ#~r  s@_D)8Hl '% ,ӻs1\YR."NyR$:ba UDq 텋^ׂQ7u+Cc\6O:ˊXJxf߬xM[X>\zT9et2.}yO .AbMk*r=z?HA0X)˻d[a 0MnD >]&*ؒ/a֗0K%f} Y_¬/azbNE Z{kz;$ޫVz\.F^VwVVW“fS-{*QT*9eUu4z7Δ? `WnW2nxs`SŶךTVxX^͞KQhDR(6 o5Ni+ ^8K`zKKuh&kE1 T(A; ZDl aqoR0#4'(xAcb١GT͗_AkMB-gq~-?vByv%NW_N5HNeSJLI_e]_%g_uȟ^iJSVZ L䴈(ү lʛ]Efk/5LfWYoib;yz #z!Uxd!R&R/5e4o1h#1l V׋Ԣp`;J5`q5OsPiFݗ +ЗRLW ftI ,Rg2DOy!LDX,-,xGI4 GrM=|zj&)aj;&J#6O"xZ6 ls~F˶ӡJ) 'eSh]zxfX8zXw^9NۀzHͭ6va`:GIqASrg=;%.9CtۤwGJ#TWͿ*Nt 6 4c{D8Y}v6nEsCbfAj:#:Ocn̘R5OCsaV7 VS<wJK:C;鵯j_ɵ24:W;P^5oG'|_]>fv0=1^ <@KťD)u2h"2Gb(Vc xd"Qcښp_Rpcd0]2t eWH)j@t|ٓO#]Y@1/7+̼" HitL*"Qfi%1@!Ayv#${,=VkS}몌 ֣<6c _/:(ˈqiJA!ư4*cC`<HO7zl.Ѵ !L gh2j$RPDQoQ&z!M*&XE KR^Bz3;$PYvRY>%Fqᛋ0+U78|tp=pϿ[tũ?YiHp2o/ûƣFoZ!X1%k3H)a :(33k;;΀^.gJ`Utb !r\6pMo˄CaTG|)qO᮸1FTm[3_Ň..ehJ R(eU}_7τ76 { r G\.ԨPQ3'#ZQ?zݲbDXH1|p>Y'$*Fѯ;@-Cc%Yw}p%^kҽ ՝,{ YF?.IGû5׃٦-U ڼjɮU;ZiX-9_ǓC4X$qp1o5ʤoTNO=`T?. 
O.xO?Ͽt/1QO߁nQ A{~}4Ұ^4Ule]7úrǺ?ȧ4Qw#`R ]{Y \t-@" d~=^)* tx+[4[;.ֆ]>᪁cnl^/6G{kIS+1SMa{*o7o[?.ϑY2)N ウA@; ;u -qbA4t!WXsq嫇k+}:_ȭhܨS8q#ѣz_׊(>~ |-,` MLR?>Un݂W![K9Wpgrw :o;cƌL">hDk45 iVzI#L*kD0y*3AmF)8]&vA`xr2PpU0F 7F sVjf%P`yG]<\a#txcUSRIøaF3o7t?p ^υFKzP0T+6^ԖiR3Rcib W JeD X BQ" F#X{%4 K"OUx0S27bһ>1_9uUDL G cZ¼B̡ipX'Ah?xkt4ռpőpxSaK XEdj"|lH%11U5{,Ϳ|rRYOv|OelsB ]~Eyg?}3Jsӥ`^Ě2 6m'wuk̉lwquA*Kc ,uEDl@zJQtXH$%Ymwˆ^FcD2Y+($0""&``$EV 1̶{6pv'ָf2rw@ArL ַ]mv{MV}ΗvvUY+m#IolIy==@.0<BfQe`FQ! JVDN_:@I^Yc(8-}rZ"" . \nLL$cNL&`L`(*ku3`epdj͘ ;d+Wj8sZ8mvq#zbjJey RJ;'WԲy!9cLdTIpRq A k_Lzzz˜<%c,pV&G2x$Y" xbJD$%F:Y<- ZŠV Mg'w1,# u9$2G$3&xP8sJQHCֱG9XXB0wb n~TMϯ?U$r< 69P?)Z݌ W(S pJQ4V.yMsb;(Ϭ|8 ittQ8đ` u&R"qbiHN,QkΤ3%sk&YrP:pGv(ǣ7;iI>ݼߖ'F7&w!WҴOk9P$BK< Ƹ79(:A#$ScWd9#B^t[ygxW>zͦ1&ZMU:aP@Nmgoξ8U;obߝq^X7× e9LoѲ+m"WYUYSK10!tgܼĸi? s=Dٴ}O.zFCSnhX†*Q3J49n %uM<Ђo[1ǐh%y\ ͛|K^]X}_@oR 62C/;' &.+̯"4^^q)]uZpuVc5hq3=0,/Wv1xcҰQTc0Y}-9hAʶCZ CZL+%vWLau.0`(/ @3#"γ$U@v%l?{ Mo'cur`" 3&\ $ B R6LXBzfw6N2`!v](OM=e[;s n*#e_Y 2*0a+,P65!]biOvάRWcW06a:gUǻf *ItVE)c aLKib)'A*@\ 4d1д]λpjt~Zǰ\}ҩ %dℋ()ҙfT)#%sE3*8aV/d|'PWKU hZ6LX4ެѹb0Pp9BŇzܔ<=%p0RT-NN uHe9{Lʭm &: __#Sk}Co0{oqjdW4)zT}TAΨxV|!n>~S #/Лyr3Z{8q">VLQ 5h d>PN99׊sÉkۓz"ݛ4HW)XO] b:m`pf1xgKtx~i<|J:rb }mt_w1Ҵh^fkЇ˫KKڜma??!?(8'7=C)P4^_] 9 OO9OgxQ W`!GAxxyt=k^ko57buY|~9)9~_>1 0{]qگCT#nv~ӝDW؋఑/K9#Q:*ݨelDaFL#ـJc6Oy'IgncⵌԪQJ=Ji|qD䘑6~3e 3߯? 
e[Fx[hYBE{l5۹9"j2٨%Jp+'2d+p8Ovi{Q;[UuVRR/^{Hj8=mMp%Tr>*5HPnT#OR(IEC+D[Q{ gz혏*xƘB[{&XrE7XmrFjL0OE* 4M3.C#bNYDS bbu*TD-׊${e.CU8 nQ0S b:z ʀxut3+;4<}^J >V6A "*)@@>rЪ>Aq/ ϳ@lxkh6lb,xr{C} zɫL-qxz`>:I@.- 79D{;}zǓx[H;3t6#/tؾ9Ik{Z,iYet|s|OqndA3-s$~~75p]2K|\K0Tf-l1Z-#»լEr|3pfǘc1W,C[1Şbc^w -WL-$*v*}lPORas?>@l ,mKȸӳF7:;O:)wLRnIu|?4MV[PR}[O2~ĂLT׸tfcy{Èfꤺ`ӱٰ~nsof,^gxHaد|8Oղ@V76̫),y19*Gd8p:Y#L;eҒ΃(N:\5[M͊gڨs)Uw;cBwǝPPhd_:xf5 6JB"JFQ OJ-U3slZǔm<3+/% ¦UWW,e0 ?P=B}:Q8vƆ&\6BH7 {J*+pf.{Bu8/rq했7FQso (=^0]]T <1x 4~4x^EkJ&fZ>罥BecMS.yg+*Xq_3Ԥ؇"٢m*/"_}7m)۵Eӷ{'erXGHԇf+ĠVy=5@FXw,DPZ :4܋ dFEKϋ$"2gKJ%QhRQ XjT|rr ٜ E2֜q7Jy[8df *[nQf ]|+90˪w↎wy=1UNV׌*dhh0LVd {b"g\DnVvxNh +Xi@vUL)hd.F |B*:TDCh'˦cMǞyB F *Y*RH!:&*K,`Ewб-MӚ5M{HM5b|AM^N8jbght'h :^-xz=/Q)rc%΋yH9kdYI25#J/J/s1u?nuPDF`TTX 4 T gr񅼵EAG(>*+!="giI#5n_NsO-}Z[r'aQveK{~}-4^>`l(ZQD*@5}a@rƳԢKN L0m0Im-P H -|@\PJg2eP+5@ԧihdĒZ ǒ\EUL a)^#1oH(_V&tB]UH_I,xys!+l28iW9 '-DJ ,*qz F70ǥc1e M !;$B+,мV;G2@찋cGcW9=cE~[׾3K'tkpڪa|pa%t\&^i]-!BȣKХ 1v H(%&m3BA" h 22>P,ߧE}D ; FD7N=37>o{3fr*|ۤȝM[$ .޸ 7cowξU*zw{D:;5?ח-߂7x_@;"~AuQ,z/UYd-Y.kpBDָNhR=65$l4cTsugŐc9zK"|(dkHkK(Վ" {D!e `Ar.[6 (v(\q9[gLJW`9Pz٪rAuu$ %$<,1@ƜxMĞDAfFh*0 {.4'dsrD"o( E`I:Ь9GI5֙U55Ԋ1k~#/lg^ªw*}F錩A q4J&U.hx\x?>VUD2yQ V4@!g^)H:Tܴm*w.OѤLD|ȾD$ DM夏6."d=!P$1˦éU$1pg`룯gua4~o['^}z Q'+BB#< D N6sRNڽ0!%30PBqfR]P:BZ[:C wPnuw _ܳG DulbBb-d 0X` v+԰X,ĔfFڐ9UR[JMuLcNQBnGvljgqݖЭH<ݵqu)On~$|~<> AbdBJ\Фc'8K$`=fUkYy)G) ʻDoǤ˲73%y_Nl]D36 <." 
Cc:ϻ'/`۫`":T萼o6mk1')Hov[=(m#7LsQ`H -J[ '#\R)d&y)eIϮT-|# ]=zoG58/?ϓќ {^R'o V{;hmsd~%\;n]j]}[kx2Z6ϋy6PŶ!'BZՃ!u&yYGc|xXjJ:b Ld7%?V?v?Z8&KvqpVD2$N昣t0QG4()A*)*O9uLbu"BE`WZc*Ckm+tL{vigǻK%w$ExDMW/t#m݋`>O}?_3MfTʘ{7׳@bTS:#`[odI -iPk4ߟ-N?T2 uٌR EQMC@ L2UT#+<6z|GK XAOGO1+bn4zg*m^nj2k}v#q$VgBU_nt5/nM[D/ QBarf% _:(nJ9 sU%G7d/tq|='nLҚ,Պ2he', tβ3f3JW h8}>ł,3'8q);ND3ȿf|v%E"W TqrF7u~E'هIlXK!DH!/}Rd+R3RQIRMdLJ6EVlo>G?w%G[75QuvI1lcYG^>c@f[*;"+XNkF&M-#Zf8H?0 㴬 +Y_ZO+PCS<JȓKq@0p2 |{Ǿ[~3"\P"›m廛ܺEmY*Cy 0cӔRN#zNKHG?m/WZ>Cgg󷗧:V B8q:N!7 WOcxzݱ5|>¨EjtG˻Gkهg|yM`Iccͺ1o2x}Y?Fh4Nz;o=Qt2w^Ɯ~ߺ䶷nټҐbUѴ8Mj6뭇 xa]{o#7*7=G5L.A.`g60H${_hI[lfc"Y_Ūequ{ȎAet?~??w?>}>LCy9`+ $Q0n}??rO>k>5lie⻜tttE|< [ ~qa89UZ DxWV0#H6E9"_?nRE9_R忄Rh{ވ-H;:(7PmMSމ6 u,q9oⵌԊIJJ/ʰul(nq~&caiVF'^5^mBNsţ:D6J8j !dTH &x$e+q $S45#쒴JҐ"/Ģ*3*Q y;bJ:DX.iN!) N6*V3x!نzqU7:#}^b AoB>m ЮBSEJVTu@Dᢶ @i% H8Q\:V1z b_$Eʅ:$aНU39qҭ/#uҷ]Vs^n=M]@#/iO7 Q fwL&؀R6KuB ƃ0I8'$"!KH*'p™"1=Jz1n3ֳMC7%(at\`R3BDN H<"T nV<[7ns mҵgv3~͕[kf\=lo C=A=θT{[\|b<0@!NBa"A"VHe ,@u yuxg,|=:-T{T8t+UU_?;YKZES~/@XQD=* QS_/Nw"jE!I]qgf" 1NY1&Vyl&Yfkax& "8O` yx3r` W]q?7Vn_ оKy׷{v̲]bz'*M.ʑrNҟ@TQ`Up /d؃VuU2haiAQ1dX^q>qU٭:t~c`4ȳLa2`.i=0"ALjM<(Á!fF2 ,Z#ՁH&bv|oœt≪o]ٰhFMzzd~~6y5Q㙵K}gӧ|&[u~(Bz݇)^ -O;bU o;*}*ʬ+ t1Z+](u9枒 0H,j֧x1*)~l`mޜGuf=Oίn^.c3p}P}"*HL451sD\@mrpk1Y ֋ԓiRR#)Vy=),'|(b\/X$ыn+D [rmK*Q(Q$f, +D<ʍb\pJ'Shu&,dZ&&w^ٶLؽ~h|zUkj)U[s<9/wpʍgJְvBCZys4 i% >)U*τsAb/E" f3k-tV)i!qPDhcƠSbTrKPIu;#gf\R ;㌇Bg /rfKj*w~ՠh\Xc OR +%/ĠIv.QO,Da ZN56<Cl`;8jK%8 A1TBwhF&Jw;#gvqXq.Ekw@:ڦ=BU%!N "hQx"^"C Iߍ.֨n-6?%^zPSBtN Z"ʃ 4=*61P* >DBZozI%]JڳņB6bc26i>XiOJF`%]w4Zy  #h0Hvt(@W>ə@ hD%!^&ebtIa;AI`<(On\Kӕԝ왜{jw6^X_*_*C8m>ugo קnNEպsn['7k 7/wyJ0Lּw[Vij~L+nwaeu ti}gۃ;DMr{]tKߜ%(AyZHn; [hnuT@a)(ghwl)gkȳ5yUjWZY`$It-L%9eMC *z.I($`OWIolVC}$&zNG4H6F;v;#g@,ߟOFC>#ھ\oC+2%C= 8 UD588Zz?'+($Xyl.ג- 2K.W9)oJ*t0ECAL(EmYpր;m<8v(CDU]P`e3r\ AzI9Ia((0Ac 9A$Rd]iU'#"ӍlW8/ mCzjiñZ#A󦝧:YO9TN^^".VdX>d$UQUjK+1Y>HY`O [I,(m2%@z; FOlY{}ں׆ߚŠ~9pr r c\ TSCVBqpgg椃@mƛD T/ԠkA:އ Gx(8䢾{_|DLŔ;*I6Ev$8'$aI2C}ՒJj{| <r͵Yʥ!Ca/uImiwm<;)II5V HLA\; Ƭ@WT'}  
Km`u4[mMrNR^rB<]tEf g9< \{IĆ@}*ݯ==y ϲ.%Ч|559[t"G#1Q}8F;^wE 7YYF<rXɧT!C |4X,VJ[EtbdPBH^q`T-\:8\ǙsBO5`v6(Rқnqَ+!O6s9Kox@4?qA1XQ|GjAZ4dBevj( h~ᨊM> EBp7K2zZ_'? gg(rFݽ"?n>_KE6ƓaUEI1 V@N )̡|E%r>lt(1[ȉ|Q529ɬ#H.<Ej- xzɳ3x5ޑ7"좷4@9 [n6;x^Fď\{w1MF0Űi pWQA [,Qp4=FMMHNa4kJ[P˪i |MMn*3[9x~Ӡex~Q1OJr Ckh  TyCl/k ZΩS_ԼX/oןoSx|)ݯ56l~ 9L${.蚬*KqZ~dq+2iIc7k?ڿ\=v}XfJ7WzhOۅa6;sWR$QvWig`1Y0e+H0S{^jQ唍 MpLEkN(O9WUV7Uiեrg'ØU7i85tެ Mig :Fm>57s@XRw+Tm $e76iRLjDhi{X:eBuɌؕr mqѹk^}C) pzG} '"Z"M4ӽ^ZnuehB*=ݡlNjyдK5UB%%sxT0-0rix*S=^bx{v<~%B!}tBb&@QDsϢ:ݼjWoYRX;Ib34Jt)*k_cF>{(.Qc 0g](tGhOMw_KͲoGTfɣ_fK'J_36XAX3zVab {'C:mqL/46dr(j]v\ (2 g-K75![]{D\AI'4VX4mNUŸ(1)l t[ Ҡbf2b@eChj\F)EB= AA\Z}2ɮ3ŸJt"T] eΦɨRȠE $e5̀j@o]\?A1n@Ơ) uڀb(4VH(,@6{؍ttg-JC(]e֜G59:&M0 #Bl( `L1R"580)Й5aD1nPrљtGdUmj(pgBQš\ єFA(x3g'j}OJ (%<3:Fr^5 lZ52 -J)MZomXE񮀵Ck>84iȀgm<wGv[/fĥ"ҌY7IcL(Q G:I "&TcY;6~nGĮ]-zdB%>H`"2Ni%PQFiYiV2MkҕjIJ22+SAyCs ~XA̤ƂBgąs՜Q$rDOWd&deZCPg4m*A}Y%@ Vҙo 7tX:?)j@J4DygE3ʣ ld-ŀb$Sƪ`| v_=Ajx1) 7DD.cʺ^H@7SA脴qcJI%h v% Aر",x"+P(vnj5BbF,[v#hXpg"B25;9vǍEQINJ ¤0,,H1#dlDh! 
c+:QbզX5{-6't;bI= w64IxF YIe6(Vӥ*^`^Qc-A6@*tE6 |+>M-K - ڪmryX砝;{ykye9kZ]󶭹I @znNNi`$ti#ll% f;2MIlfq׶(ZkM!JK9O-'on0&]h\ȃg8w"~TX Iy[_07WgCHW갎p냱wiϧ1swO^_ ủ?ۄ?ۥKFs| u԰Q?yZK~hŒ[B*Vr~Zg΅hNQV`P l"l Np3i|] jBĞ ׵d$!- ͳu9?|nAen՝|.M(Kf!@p 8)>2*^,r*ro>Y#Ǵ)?߼-XE6~þ`VH{X|}[v Yug<_{O5jP{ t2 O&Z$y(T*yP]Eݲ ꬭ=EtJv뭝ٻۯl=dy]he86+׋YyW`us|~ؼ\;.;r_}37^ҟozc5՛?c5"ݾ܎!P%KfwZ@e;n>f,FCXd'S.itͪxts N"R)8@dN_ˤ2`\̜ӹ2A8S.'S'% > bjV3l6y+aUTi+=)h]ΫU` K1s,E[}dp HK'-zԸ9e>9vA9vmTmښA\,o|{)__߳#eZ#X?xuѧ~ٞEB 'sz"?1?1蕛|:]bF5qY5CeLȝ1Cvt93'u؄ M.nB_ЄhBfuZdiFvc-IRZYE7Te )O;#26ΈLfΙr]/5΂O^F@oِgBCk+X=ãnMZ= bWcśzj.|Ћ՘ڨK7mQӖ pF} W:aÓ pa}ĠkƇzn]qoĕ6zȻOY]?c+bqdw=y?ì5׋k^Gu~9{{a~]!мK%ć5Je+7v76DLJK5gj%}99wvDs-i橼>Ԯ^TUmM28$`C xS:.ۋK!+_Suѫ2!t!7uZt: ՝J ɸZ dor*$;ݷ߁@ѹ]nW7;!Kқۗkp'ǩ"EdK枂&:7VCMdyȶ+|J% ݆2mOq{2tY@}c l~=nhO0ÜbM%y3˯,=}ݷR ;`v0!ٺ^4)J gT2\2Ĩ ;CD!.^ kvEE kmH_H~0Ymv\ )L "_ I0%4{*!/ڻ1`F6 O_Մ:\GpXN8zI:Gu!)!.:Uc,QAcX ' })??qN=Mr7Z Y0 GRBa4Q iAuP+r1;{|au/&q$n 3i)pJ#VXNpdU޼RlIVɍq]fn41S67i r}>ɛ|\yLM+"/8& q0R!44g4\Iu=Q*X-Gv; .W|Hd[;b #XL*tT!&rD-Y(%,Uw"z_9{kWWbVӕW˻ѧmv ؖ U`hi)1H-h3W+"9U#,jFBĩT&ʘg7(7fy]1][E\hVPCid =! 
J)݈KI S"xl,0llf(Q)Edk9>+;@nxaZ`&/&yzg*ߙ" B,a2CHcna[ 3JY ,zᜤ04c,;16wΖԾ}&ϸV:aTe%}/km9;F#4ݙ}{rSs̺> 3.økpP8-uD%ԖhQjcdgܼƸiP;.|&G01kTW_r8+g%!;}`ܩgb,{mPr9`Z1F: ʄ ߜ <.^sH>cmimmka_ Sjgjvkyq s@)}-m(^ tT_A1vIaރF޴:)o$RҶ68Zv\ד E f-qa/{xРMg9G%!A0WhR[Kedੲ]3;#řQI@`r\[X/%ƵD֑hL dS(sVA-0 ,XoR0#4'(xAcfm8̉\f g)a}6jjD U՝tA=@EZ ë«\H\n0WmZe4Z y#8-1RǼ:] Z_B}4_I .ՖI X>/ۢJS}da~{@$.rg/IU˂Q B}3m|=hG?m|e&D;;^s {Ke9YM7G7;;[As^;]8]mluxi5kAlQM>`?'6;_1fǘc%˺:=0cuLgxt u<S IorJԋTjnܳ 3i2.UwwRܒ tvtYq$%T=Tp'?[bA6-h1!^3:kӡ٠z晋mvS&,ZgNxHnK9[&hyp}aoƼ'!UՏHot4UǓGvB9etk[5v^<(_ 1o&IWG^^*.%JiA{!9`G!#+AwidM"XGU!/mlYfca˪FًQKBz݅(^(C4#N,"4 <#-adRSA)R=~OOn5O*m#L]uV07h2f!;Va(DjnKCeLKH90 Rh|Q H8o=Pa &LI8FҎYrp-A[/5^#r5 cgt~LX=s`hY :|_TZ[Q4Gpt*[dzgHFǤ"9%aQiƝVc L"(O x>kJk 4oY^v+ ,#C@,:)Ө L / _= \GRM0p&&9!F" JDeҤ B8IU#emܜpI8ѹI㤲}L~DՍqg?øwoq]30),>}:mugT+<+RY+3C1hVywE'XST<o@Œ)_UDNS ySDNnɥbNLUk8 OB_{ӟJ`Utb !rl.j"P1fs 3t_N?bnQ/_r1 R 1B w'nFO_ ^l~=q#.aTz3b&yr1ŸλƷɅy`d0a!l0 {WR8sNb0x;ͦoB-=IZ{b|uOmݐnnfT0 f0bAa:cxͭrumq񪴱:@>\zma,8'6;k@yyXo.\ / .;o/.&o} /0R !WW W I<]\]CӮbt-sd7藢OsS=7S5Qoʥng6ufn* ȅ~ l~ݔvU<?G)}QCڈfrP@.Ľd/ԏXk-bW-㵄aFiu&a8uѷ_{:m?"2<,y0GJQ&muԏLupx7utRd$XG @HP!gK!ϕ$JfҶ3fO90iIGY-#H֝$ĨI!CNxebĖ_8s׶hن:XNV+0`kuX+Ы+[uJ/1GIq XjT ʔV0I5.ž]}׻{w HY쨖*2ցNj4BWX<ڂ⤝>p6Hz\(txd 6O \o~a覙@*~rѡAH$E̋F/%$ KpÑO+`'. ԟo47?wza*zpp==n|? ? r W=u;4UhV_#R/ ag[ ' 9k՛o&` .p!)wT *4]cmcW3ɳif*{3U*eD=YSsڱQ4?{L;.nqS߆oGgAqEsh %,R .;gNO ax; f4- 07nwfV~r*o.,[㺼T\1fp0Xv!]Znu '1bePQlٚ3ҋ6Cن^P$*8ʝ7\ s1qV{tHbaJ_/2TrN,Wq~ZP8訸 b ĿCO>)o$ײm[fm~| -4? 
ۧl/xTku'}?h>O,hf_.|zFsTH'6on;4eJ6z!*=s9)jobK& &cM?^{ɱ] %ȗZrVSM2^ǁ~y?ng ս*2(\^}E}0՜]pUZ T]i4_.mPuUb@FX,9V~-g5Ѽ :4E{.Z{"qTB58ГliPJAos2` Tؙ87#,Y3͸/ O%6;5O7dL"OΟ_hh8}ምѱC][Y^ bO+Š @ecDSVNnOR6`0d {QHɔu؝s3b88ڝi=QEǨM=j Z t"KBBATtifd1[-e`p:QQ-=BNhc}Cff^tT2֤,Yh8EM"Sag܌SATqc|㾈"y="hDȁ"A֨7JeR(*ɕhI"jfmCFqCy%nUa6l|M%E>QHLdN[}]6OUwF=ϪIifwX5.UjY-+ڭ*-M+1Wf Lnk5Rl \UitR^"\)Rp(Zn \UiW֘m(\U)p Op[] y?8]rsof%VN,?ā{Z 5/׃l~iYܣ{thO@g>୍pciր4K#At{~ &=.pWm*-nCR.K++$j"n[WUJٻ/Fl8U\MWUZk7+#VW/7%_?=ߝc;?|~eޭO\V,]g3_#HLAF@SBO]qkb)HxH_*e~ 8?6s9+wx<uQ9@ϝҠx>) A&%^p={]TdyJٺ mDCEĒ$:!\Ȣ(1`,Fz|(JƠrE]0.Y ˠ"MI,0Q8Wpv&-7P; >6Lfe0g־ `h뗂]S׆+5aϳ&!:0VGٽhB~e.M ^5AI1|ޯˬ?ou@LKN|6 1g-d IbuU^NX,Ĕ"$)IJ*فQh/7 `hu|SĹz1&oij'?d\:;tž.>xj^7O7c@/ .ѤAp"`c^ko'ƻ`od۴!lz a :rn,tŞ+tgoU矬%cƇjp4;XjtN1fFLdzEO?ӓŏc2$ ZA蒈NZ$ JGTU&+"DѠHݖRGE>'QXn,QG)" \,25압CPss" mR,%־ҵ/LY3g7zfq.>ts3W=y&c@t wRyw@uIPPFȊm6ziiLOu>{CS.mq*H:XKAXeuR B#'Y`',9_HveemeOͷ>6p>A'6ɤ$6Qgfh}Qv\q5y4FO=:!&a{rf5Gujvm1S䡸ĚbȬ5>Q5%ف^kK(G6VEJ% `Ei%I-C21ʢ W\Tv֙8̬֮UɇO?-?LYj1@ƜxL(Qj\2H*|GMKoϬp$'㓈PvvsDN]#[Ȱ$RfSΨV6I]'=/{@p35e5΋/jWN4d.6t#0H{2Ik%DՠȒ`dNb&nٹ Qi's)ERhPGwt6uJ,!YH$욝w%-w>M 5;?泜ߓ;(DB(0i`=h@WOw&zI2 6ǔ_u>񴮸yKLҚ,Պ2heX,;l,ElpwVXq?qiw8FexFsf(G)g% R >|d?OOƱncHh 2A(Wq^Ի7p ~paPӾaSN={O^x^C{{C !%mvm.ObzS\}bGc'>hԼĞ}?8FkmCuϼmO|s<8f] Xzo9 e4q޳zv%H4O?ްv(_kZk[B}uK뚑QUUy&x`:OƣeGO g׽7ت׷|u}W"wul^ZHmX}r룏y0) |YPVy55?l8GVǿ0۷՟w q=0˱*V`~yR<@KM͏whZu4lZ9MӶ˷^t-UW^څia<p[R?oXw>YN3?uldxX Wq-,6p:GRTԙT :3=ő젝*LSu1Lܝq%_wۜ9EVZ3)e~ K"h.w:ׂA2wv^g5YxsJGK򫵁#?X%R^C'2TlL!S&QB-`Ds&kR Cԑ%>oetdsC1H\u~لlJBa]d.evWdjr;Ӊ/$W>P[Wukϵz\+>͵Uܟ2To%%Q_ .Hk ڟ_S%_G]1C?vi~ӧ,k%zo$Gֆe"ä+yvJfm+#jkg xNDir^͓^}mBC(o]eK v>wi|tY f-6,?wn{3?? 
jyvp؞{fG|kn'u߽_{эa?vwNE5`Ӄ#vuC(6Z.j*YKC޹Q&d`Pc@?{Ƒ\H~0wvlrӤd+W=IiY'iTwW˂=lX0\ŎjiJUcu[N&oȅ LucQ b޻{+ft`r֛uҰL{Ȋа2"9EIbꚴM%Ϟs2或XM;mbO447/Z`fNnRnd =!"=AZ3E4^JJ8%RbSuT΂wDðq$ܤ y,)"*ߎAs7˽ - J-Vwa9El>bP%ܖ)ͨøJ$sdv sβ$t\]X:xfXFD7%,cI[:V{v=0̴$MŅ[yҡ1ܜ CEr$Yq}.%Ɠ7^ R)o`F{ Y21c{&>fWPl'<{f>H[n[g6 LR6ݒ jZgiڡ5A6=q&j\uǾ[v;n>Xudɫ*l։I}m}53wuzuE o&I+FpI_YV:B#1F|R+1< t֨ρ,0E)+u62+5͜zlզǪFًQsBz݅(^aDeP7}x}Q46ʆˉTH71b1*5'Lj" t5^k{ }[^s7c'FNM W ?.^ x$ )[X1c}*B&Z i6rOU_)E9z4pTֈ`T*gnrRI7h"[)0JJ*E.]`FʩR ˽);w.r֋hUa ]֣)O5efczbJδ`.sRIue{]=&z qKܶߟ#f ]Z#eD?#KE="c E h1eEFd>ZRlR 飶ěU`̙R19qr,bpJQ&Y⍭_Vg܎ƮIx_Ö. m*" 2D;X #+OTI4$xTH'K`!#a:0- 92#v6r#qy,;Dm2P`Wl{M8Qu#*e,H`P!Ɣ9Qfȅ4>XȠbVA ('@q904&3f#g=VF} hl\-"⼈!bK8x4pvDF(bdk&083|$l8-aXM#ʛSad@Ob ^2wb,iP΁%MFbRMI8s#z 3+'#tXrbqbLTk}2'qyrAuAT*wP9M){sR8&pT%d R`6`Db]Lܱ䡕GTxBGTd5CJb #52(B;NxMg6Asf.\=w:ߡLoCJlzX\}n+jF!VkA1Gyج&)%KW) ”buL!aARb&$E:JX5FҎY | Z/5^e%9: 0@!c 8yC(Rqk/ PxPLC#3ZU lo:q`UD9*#dG(Kocik踨Z*(ww\2&_)GT1S-$I$*LeqR1(4|˖WkХeVpuU@?w F ߇4.1$$:#V &bL/U=}{۳ofMn7;{O? 7\<ݨ{6SnT*z2(PR+96L=oeg'MXU x6ELn6M:󖷭_Փ<7m%x7 ؆+lh~\ŇQUmQw~ټl?.TC&7`\%|x>FTDopcqpyT4ÿkǘhN0蓁D.UWZNwpJQKFAb50Amc} _?SkmP%֕KmZjCX3((dQˠϛ첨5 7Us%}<-`6\0?mc(& %/5ӨXIڊ_nlP\^ab4dPW^ud9e,!]Znu O}Av[Ƣc\WYM(Va>ÚO]l',wT `ư,@QVPK,;^jA(iolWu֘ŇaQ/t{/!lVlfͥA◟sNBL+\|sjy] gQ Zw7F9 '5"Yo LdKp`{'-0TtwFݧl1)[O1W丄oN;C3T#RC)wÛۤKFD]C{pb9|]ԳFh`M!&)_ε{p|63&gHxo&}{Bhk%(k~ _:y\&'[ |Cc'3SpY9\_`ǝsȔ=(u"!d")[ )R4LOF;v^\)19D$'L䪓qMcwM&*i|I8&jOfm+~%Z%n2ը3;˜,|c*>@$GW1~6J_,N3Ҙ/WI{ɱ6@JX|%gS4aKx 1glon`K> kf$l:q՚=rRg&4]j-9w})3h4PJ_@9=E'xYY.+b5(oõ% .Uڦ-.Di],7ɾZe *a%_s7Lb٤4,%}D1I5Y8Ι3:Y.MWh^hO J&]$cb' ƻl?%䁙׊&yAb1gR)'M+y2P )ORtDL@}=PFMpp+&N\MЩUVc+F$W`j{-$^_Vվenbu9ao<;_rzV 'uSهi~K ֗\R6ABEl[S>!3ܬӉ\&O*_#Ls!S}7^eT-{16o_ggu9LT6^Q K1B+M@o _c0vɧ0I͛bw} k>s\o8䅲QuK]EϜ]}L>Kuܛ{lS`yẕc˱-nߚMRp""psY4@.AT,DOMvh3!:X4j- dӁD.;?yV}D}^\)"(^_UX!#/J )%( ?+fr=۵8]C0IZ2'0>^O'o*ڊg1MkQx߃ExsLL]t^)o"G*`y\͟A9odO3cp2y#",j)C `8D10 )kw-mI{2Rɮ<gB?%)Rǡ(W=8$ED#bN*Co +3ro Ro^ ot6e) ;"%;%5b`J裙Η { )β2Wls։Y]t)5Wƅo^f+H&ΌR`SX|:b֓% HF³:N0Œ=0h*TVEwE'X˭77D8X2%S5tzNS ParrNΥSwb2\I9xyFh@0! 
H ˥CRr՜\ 6tg͕Ca\|61>&X/iOvpe*5RM:G)t:UF`/O/6 { lU*FGJ5uT2 Y~]4[fZBE̥}8E=T삓h0NJ||o>0~IYW.!P9"|a<ӤNvxtNty7r8[r k<*AGm_kwZ-^4>LH2DIiّCݏkm3̈́o1.E9_cħ oQqoaYY}7c* yg.Tayۯ K颢/TaJruv+E>rG].p(oh_Eb}tKF~`$5^>잕R+&10JSua[Eoֿ@upϺۿGf `gexXJ=`sIMtOMyb  oS6Q[ X3%NxJVd%itisQy`exe0R`rP_G5QZdjO?(վ],f`+cF1 )`)?(\2J}UVX\b_w?_< =`6n=::: SOsCHE12lK柽1V^{C6N%\<T{g}"7$*8ʝ7\ Ƙ8P=d:FY^1Tk%i;yv.WT(ڣlXTUE+3h>Ta+]vt-}6O>;Uu܇-Vkm˝\j@r}e'ƽ Wc'UdU@υhGZ1G;8ݕ7*FـJ|DJ.`lG+w|I>O/aR/1",DDꥦ&ZH0<H?uUkiD]n%XWzd~i81݇㉵{e?g U ̂.妗,Yydc 1FRRA(Ŝ(aHI|/ 9][GV.K uC%тIM<-ּ}7E{1Pry1MmHĔOLB},qNY>y7mj:iDl4|bRU;ɮ:&i6tmvRʚMTYuǑ]3P7V]1fTِu•Ixd!#/)BsB ɤ&,H^?"z8n_-([]lr?2Bx#+V_^pnG eZ30S}=6MVHKD൳E$iӗa(IeF:>Or#(E1xju( ,&I@U\CFRM,@(vG@\$rCvYI(J1,,VJ\,|Z%-4L釙nru{4]rɮFl4Zz 'H cTAYf#g>6#)ƣsǮՈMF5bWvΎ( 4X,rbs"IzKDӈ&T!!8X8KFb30S ЈX҄Ij$f.վʝl395yacuf]"ͬI{x*kwE]>PlS_FqzQ}{ͷEWw>.Zz&1I1~W7&ۄT{U7y')?\G!|cLo#'' Hܧ_4I~V𧥶 eڏJBR 䢢L Q68ƕ5ňș<*X:tշM`|PTiW~@`*VIh~LYwzGYO|ZmC]%؋*hz=jKԝnR)vvڧZwfqY+0O? xShA Rd8\O^M'7zǫ\+&G#//O;>/j7,ƨPԨa ͬEaOq`ka|>^_N |Fi|{)l\(>)W(wZ*aP4 Q'Ml2u !II4w`TsqB]]1c& ĒDl !arK1J,.l[D ), ɂ *P$hK |9;omZԍtg3lIEPTL$TNQfȉkv{AZBσ@K~/0ԏӯ,\']:І$Hj'a=ș$a3 uvw jxw?u۞JһeŨ0i)9J@$m dɐhb8OL<]l'^!Q0"5Ry!!v e$ QĈZ D28 r׶vE?kp15*ۑ=B=2शX aT_N!E[tr"-߹dK$QV]=ک kc3 yv4f*{m 6^jZV H\~]MfBFVVusmy%[Oi/y\ e A{/% ɣ,d%܍فQDLVf6a 'm& JօzW_> ]/41jphc.y\]0?, |04?ZnzwߓɆr?נb4AݎdpuoϥlxgD~+x֭vH땐6hwVpN/M0/O+9X)5C,D;!@,|1t_,~LR7}d <P(I^eʤ!?})LzUQ̟.|ׄQO3V_XHk Mr?v}7ϔ֋̬<-LfkpO!Gs\ f~|lb3ib]y|׆!{d|fomXX:`x[z Q-8>6qzaNe:]4PtBZ%x4'p3qAz?AYwK ~nw^^|PoI|xٯ.4Ժc2.e쨩Ӏt=Bp(6bS1W&J`tAd)*,.j6]DR%4&=y|4$}F[dTڤIx X4Jduil:},;(tP^/[ǵymz6zt_; 007"KU% #xz2ڼ^q:9oZ˞RpC&1 X2 >$=nA7cg񮋢^1CV Δ*AZӐ$mR۫q!ɌM~A1lEkl_;os|!1 S|FN8-RɇXay"]LB @`9wPey(aaq C1Y >0( )vچQ8 XҭEҤSdl m+~ܦBx>m2 l-M\OGpmH|ƃGl̢9цa/'Zp%Hm$`J63D `"Iqf!G-K3h B@c˴ñ ~^Oijrr?bSeF臯&ywt)' U[ w<%Tv%_Նx: +/##6ΠrFjYTIY /!;5Q Zl ЅgdY*%tH>jm9fe"CYz<c%{jom^ԙK+?y ]XzK/^]+^|W|L$#(V H3$cL R R.1$/%|`մ2c3K!&Rfхy\qG0IX#rV.^L8!Om~f IB7ȿ.bT0gjK!CLm^S߯*?2./4=F Ⱥtp)օnZb9z-|\W hwx'y|tK'y*:q*@_F/G;8vp΁>px-Wn 
fLDV_pT_>IwyGV5@G_rNi~s%kkZh@zvjh'\;t)L/~:cz+rBe !j 6G~_M7}vBbߪ eճYH9?ϳ8i2:3321{m?-ݴ=[XkV_nrNnI8jyW]ٱwxcZQґuXVp!7.X|viD.z:1y,A\8qFmiFqof4o=5/GQݯ'/ϦG^+hJ%1ݟ|rhž!`8t6mS"[.I@֌׎>Fصì٘V? U,hxeГ1?O7Mn^ͣ.&nԵs֜Iq;%FP>(x03N6t'~uv<=#bǿ2zë_P~y?޼n޽wo^/JF$[U?ۓ+Cn054Zeh]sa\jNy͸?^Vħ5frmH|9OBp:3g Sqю<$gގ'0o!Ω cB [1YZT&ם Pɸ+hHs }F[p/Yτa@b罁 oo6uH.MS&C^K (r*pї|uir=ṵ;?WPykU%ۭ[9KBpz@;ͻ+rj:&2 t 2~`0׋rp2*.c[-I+,ּi ,a܋kʳy+m7}P׉v\kҳ yӋpi})="/<=;sK5G_'&Y'5,^`VR3=ڼ4?ff1Lŀ^>^&x~0 n|G}o۳-};Umg's6l}hGe +%甏wT]XyQ['O J^-ɾM-]gox5Z|Q^^j:*/JfnlJ#bY! r‘qj~S_տm+ʓ[Jrwɿg=~-˴N%q_T xD )2RvRJ6ߺZSP$ن$3!.%br<@HIn4R^kfBCZl~ܞe%iKg\d$r}/] /mtњXY6׷9A#䴵d9rk~6dk)]}X 8U;3@Ő5iL0*#T4,Nf-m9U52:ǘgR,IG1[ϣKN@  eRR5c5rk(Ata5W^>*]mzVJk$(sNLr4dx25vJ%,`,:e 9x ˌF92![~F.r&TB+ۨ3{&*YЦƮFvGøb\vEkW=6ZZGm6rMd2G\,SFIK&5)r.-Ϻ>Rih`@ DҤkbbAMPS ɜ'R# dT'\e}X5IZbM_?ՈPW#׈OE#CM3"y`E|&>'|r+2ƔF'I` y.z  ("Ȓ樅ӀPF5rkK"\]:%EQY/^/z*k]w [*Zw쁝wv:ކ _{K`\HI}w|Rp;DkpAX$0)/8fRQp{184[6C$ROmCYNIeC}v+)i ?u B*: Uը ,X`ZhU.*֣PƔa-gLNKCe9;896NG'KG۪'+dOk 2YU2Y Q,!u]tOݧ-$5Ӡ@^Z&N ɓz됌,ic_d/9< Pנe*:s~Ƿ}Vq9AXV6&sO_ʁЁty(֡ɛxTjϚIYF* H%U;췐Pz-WYmvK׋]x9ωHd>b7]^?AC.o}1jPvu3k{ZdK6[jͫvH-j۽M{;&ڡ畖a<motv>]x~EsKg;4 5_4 ^_b`>=oZnm@?f%trQ?āZͻ8PiY8 &DÄH]sZȵ&)Q ߽z:*XIq WB6̜yjՌI.k-^R21F.|Nl!CpV9F gHM}4jkcQӅ&*%@V 4X+D(CЍ J4jr,l8}7QTo9nC[o\Kf"蒏̢(ԁ[OIX"Kl 9Zu5(AKPJI*{edv.ڨvNJϜ{Wӱ+ˀ9-7IJ*AfII|YQYΪuԸ]y Jrge@b!apQ4rPYSppPcE|VwH\D]FYtVd'n@8F$HgHfI'CjP+LOuTւހQ)L 52'hZet%.(!^qTQ4=;JjW5!7$+vN3\qO`[)CLG@{sȉ 2 bX}]3XvjD]4fB!mtN܂Z .5JteLb*aCpWFvَH'K>o"5Xɠ[&ӰTΗ^Lp);+0CLhȠ\xrGCFH691٫RS{"s%ٕ[/ٴĚ&[l^T6vŶ|7dhf3&CZ0YZ7Çt4m׈^5Bߠ.(Ycdmhooel_sv8n~`PZYk!W% cCe61(6}୍(b(yURLyo\Tki^݈/48EIluOr8I >`JʆvSi[-H߽|hcB?AfDc˹}KGQᏓ;+䯕{V6 Pf#BZF-B=$u˺kM2 2-B%Semwg.*( qҸRiPUفfB!w7,Oh{6/wow7wW*c<5Xe<Ȃa7fP.d ՍRB Ke)ΚFGPB1&d\+D}b^'旵PvJ'L3OwHܻ!ֻRz{<bl7R{ KZ)tu )u'Hͧ|uQF Hnv \Q'pnPtTNTA2Cގs\N%TTseH9k9zXn.ho!iZ>) >1|H/7[|WG1bjL@u"+k#7o^?#1<8b&6_PHP"\6) owuM49*Yė<ب4d߁+%'t Xװ} ŮV{pMOi[ WU _zdGfc0(<H-*H$d"*B6ƤSQ"tt>A .l4䑷˒:9xTľ:qG{zVoCcbk3BC6 d)JȐsp6H)ou{_*ԛM}6'6H!]^C8dPg;IRIN *=C1ӔgcG>O9oe$\RH@Ik[!B$~a4R:+t@bM9&BC 
6Vi$(Hm1%5kʘ{X^?wRŐVFqHw;ӝ\x/`ܻޠS%cvge9Wy_?| +,/}7o?ٵ!x3'A'/aT}2i-ޙ^]c/g_| nBZ'r02_Y/Whgz4촏w4  % Qh?8UV"iz'^u?/zv9prᮼӛ*zC:޻˗7:aIChT'w?yh`P|{q+pJ `L?DY# 5?uy}ƖOX{̫EGWħ7f'bHO;~:}՟=-I\U]ON^ b~xѠ/-M֗Tܫ>ՃمFTa|?דU$>%%uU#E};U.O@^C DC Kk l6MiЫ?:O˯GGamЩ-C/CL:f W r!ðRq4~gd;%cEd/ c>y=EGl&IA[YxjJ`7WR1P/\獇Vw`1j3Vڠvϟxθ>u쟖ena+@|pB-pTcXs5< (bP1bxU,1ҹLpQ2$dM-2*mr`LI8 Xolui|ڌ{+8WHJtynD]گW{fyGo(9v\V[n::5!a2/ I%KiBRbۛT*&$aԼ>6<β5k? llRQ2s ΅RJhc?[nc6W9,J‹ 1@Z)9Xi mmedb 0+P|)B4.(PQgkA Bo)ꕷ*4IB(7.4#'1i\ q|LNN4V{Nj}etpO[_J#o֬f}wT&J%]P˜ "c)!4QR7I}zkEd֊zG7x@l:w֞e ̘<3 jL fwl͘y']2BF4tq2]X]"m`|O.Fq9Y?D|DK ^#kėŗ[nez4:D\c i/ E/M4$,Z9 Ka'667e|}皍{ (ހNYV13r;F4$U\,zgdfkɏJҫ~U}#}R!.z]Ivu C&N3Q7grxvߺ9F7oB1w1Ě%ERzEScDtu2bY΢x$ ^g鷗l5\aASbMQN'9 ڀK,CM}paTGc%w!Slo0`)`b] l=%ZP9P9%)Y ;Զ 0R,'2*FT ܠF;F#puVQ?點J''w|kZǚؤ~dF@o׷8y?vkOzsd/%]VG40"¶3E,T{ A0(ϛ !@ShK墥bC֎\|T9Pv@@VZ4 +֚95j]،3 ݨ _.>YJdg糴hvoɏa5v֤bι$CA ϖVHł$#E)`|vYadt^jQd!ZȨ|p(ޖDE$t5v3rkx4Wm͸ZDcG="ؕ64+\dSb3RY18*}= `S}XŐxb l!32Vt҄k2d),GQALjjfׇQ qklVX(jF<hDsE' Z)JEQzƆ̜o(b.@zumk;)E`zȀAgxшɥĖ%u-V׈7 pm:,#q%aR2I_¼)z7A .!#ƌ^ƲҌ{Fކ l(K0( `XQ1@v Eڝ(2i+f2ɯp(-UBOɒc6$] IJSX]D,rXB(^ A%Z.*-W=cti.7.c.+D%Dr*PWLJ+ UR!XU$E6(c[$c'!"-ko\ ݁r`䙳}g dyM[0ƴ1wKEMl|0@=ELLł|Q ?{G_F|?iΞ[73gn- }ax]d^o&tj evy(@huf7إi`d,Q󺮗jtj(rmg]ziFnI3-ْ&wW4Vl-wY&m熓;<EuJǟ4]W!^'`ys^AfV+":Ѕ{,\Q˛r˛:˛"˛ [. 
JqX}ry(Q,$%ːD4X 2+ёZބQ&ctU:+̒0,, a2e$r.|˖֦"i5XH$ܽL8}j]t V!rxu!h^م^.du͒vTRpP4C!I[ML05le$ rӨiT ZP+ e' A31dL̕'0W޷ߘh8k$Q&GPJ]y!@EF J چMg$C׷KK/`O'8nfQ]XYE\ʝ&,ƙR6D5c&*PsHvքj.z0x}ٞE#KD)HP4gZOs܆ _,i3NܩgJxВH@d]CΘ/[֦&5|٤BF3t%V*"u&S,ˡ+QM"(svԘ>;h^ r_8IkL,oBbb'&>@äI5%Ztj jtuz-/¬;؋4Ng\y/5YCyFjT*M<2{*oc3Bx-ZpWfh.cPiRV 'G/EEH%uv֞Evy㪄|lMiHg7|k *X.ώ3$n.'<DSd,E09)8=c)۪"Qc]:®/ecp8|Miǯ?H$!s6yg/4gnaY9w8$pvEKg7/|`~3/ikv~+eѥ_&'RɖӦks}{i|>,s7R8$hffX +ThgN}k:S5;#ot)N Jx$"Fѽy:nzۡdqўIge~R݄]QdX<~R_Tw~k&L^n5 //ݧ7F~{4Փ(ϣW3T*@ԛ!J$u"&R,ӆ༕AkeVei %e1+$>Y+JKat!X\R.֦Z&xQYlDW hIa5c)&[T8I7r;!ѽƟ;bc N狈"", Q!dtMP&E E㲳!"'>V:h-]BQu Um]xSi͂S&+ C0h)|kh.m,GKmgLz!c^7=LpR8aW&>ZJk}H.B6B))7OQnh.|ƍ01mT} ~a߹4g~'v (J3ϵz"LCCqL碣[j\yj@/cEijd|Ī@cWCjW6ϴ#$lvi)H{BՏ'gxe N >i( <=X)}͜ 5ނZ7iҲLi15M;; /ju-@duP@J8' bXRe R;~~~ `.F.N`<ǜ@@MH Q]f4(X>Rk.JrIJfK+nF) ; -} k,mTkYm+ȠQzoM%37ʑ3 A'qC5/"WC=&$T+PD: -N\KUmv2 cxȳ.%lU9dg 9GӡVP I~%U *[RcP['_ϐzdd>:&kBL˘Ҵ1W{>lR5.,b}F] Fkhm?)V&);٣g@_B1I$@S(e ^J84p16 uVaMHi\&qڋ}F&IQ':8epn~nӯ k iof̻(yt][<#=..u5Yٰz|O_z,F<?>.`MRj/gJqbZn5BAjGȅ`I*K dtT .(&䈥=m1Ys,550tMͷXob~XZnWAr}rK#TIs3+ Z'β.J)8A#m+x >CtAouGNaMHޓF@XZ-3(U`%*;4xg9J#ujy+&P~l8*C/_hs-B7H_ɢ]T"@ KI[S$ Qiޔ:\ښae/<9`Brg SJ %A1`N2[1R#hkThtI)*8)vB~L-NM)rưOʁܣw-4g XV&엂{eꏟ?Tǡ]`5hVYsdKW*^N$XQ%kP`]H()q9=FJ.a7 )^$?⩯&?aUyJ Mu8Zq7nuac~Ssq? B; ^W#ML~ȡK59VbwaF4w'v^>qL^Ч[O4n]#ژ~yrx&5Vbד܌.{wzwT:yE.|VcpHX/X=˫ѹ?)2K:5DunW;?&r ߼}c?~87@;h)'O'M@'L?C˶&CKڶ_c\|㾿7.aFtmHʏ/Oo{(ד8왤 *⺟gj{ȑ_),"@6 dłEleɫ)ݚxęi X*>U,S>"b]K/%FT`̍Tm]HпŵY(^ k[\}m|tK|]9< >^kHm8c>x VOo`\ #i!*U$E`lQDyZn,'i2N0xKٗ$Tp*J,譇ר U+R+HNE}gF0)fG&^!FwPBm})9=C,g]C0 P%yԤ*úĖaй2lE,CMr{eսnV`@V!}]kuW0_245vj>;/ MK:^,Kڐ^"GQˡo?=E \_$nz8˅f׿Ӥuj^Z^3u|zW/ѻ|b|/8-w_.5#{?#ƣO]qK^NۿzcwA, 4!1־ ^lDb ah=c! bd@4Ottt{הEd=QlC,tr$؅EBPKFvʡkX3A3σWf[Ι6po>,U;h εY>qm>:9܁9kW;CG2M,N8mcyhXb!X@Uy^u@ICJURX[M5fr͞MדRm[)?=oi?= ZfcsI;"TC`b`M 8kb pY ѾXN(i|{fUzRF/O}|]O.d\3?ʜPO!O>{?o/9=x"Adj}jyjYN\*ѐ|ߖhIbÝ}b_/\tRUx6`ށ~gPW%΅-kK. 
-u1pqG>,o%`&vUhT8S֨1=a472,6Pݓ*'|<5ɜJgHu H^n F7,+?MN_O>wɏqUj$dH9KLQӎ'Sg VJ%(xNJ J9h[m$"LZk''# luB@ 1C۵@ؒ`B[(&V8ÝX [藟.\oϽ"[JXf:{4|ϯޫSh0K{x@x5;^}DN\7DOddXfÅh{՚2P"35փGȺ$8W?m8e]ĵVMv^_β*PXk!C%'4R>)S&EKΓv_P]r*iFq3ז<$sG2(|+Ƈ+Qu%={A򡮪-Ŗk}wQpղ`_&!W$/1 ޫ{qjOATOC>{i2;<{ɶ};h\YB`I2vK>J[6^i%X\ /zb^yw*} )yJgemO#trlɲQ&yR®ϥ0j5.[K%0c8p&XL,Ng&q=J9/0o^K}K Y qȗ[R~C:99xr=v)fvRdTH01d%X5Wmm,hTzl <!F7ia:Yc*cwqMgVp9M;ڨ:{8{EY.r" Z)XAdXU֋ca6QEbu>eb@JMkJUCvkkEe O?&p ꯸+cLLK?#nK:{#RrfwTE*κTD-PXWᨐF:V}kp(<<բ(LPЈojY"i$ۖ58y#~|]Upi\SuvӒmg7EUV&~SX9D)_}j4LΚ`cŊ>_/Nvl?\uC8GnD7eIE/K$ rÔC65f{H'DlIa@8D$/X,΁x'^5ӫ;yyv&LKr6)s,v4qqec@(ȯ[9EٸP7+%!\0BnԲ,TTKjvgeKwg'DYwWI2^*2YeOb{(K(AJN/}Ts/!l)"' Bm9( 0 ]B]T`ZK5-}kzt'XrU28ƮC1-բS[I-EہhRnmW3{4m,dʮʎ jE'}F"J+U`ddf$mlB6v^̃E~־m=N ( Q؇oبBl$҂S'6jR.9gbjfٻcuF8b"!+ˬif.N6lUk/~B/~=NJuFDŽ+#Z+:HX+qVb%F@uM ρϩ{Yfff,VEk6fwm/ݮ&,~1's IWxAFF//tEǬ/:9OsdU]x/~lp[#ٚ]]lMt~w\$ O%ǭ^WVͫۿi^݌g_szx1rSJ%Z؟kmʯ_{/=y{ &9ɰ0 e9v'/rψSێEc&PUl1sfSS mn]&SC2j@&‹SюAGuMTEBJbؽ8oC.TfΏ?\ m78} ݅ [8~6 UA'C%NG%Y>lNV؆ׂsZf.LFSHq5ZLz_& zvw^UOr}WbJ+@цxJ= 9K 2!ƒV⿹dYUO}Lk&=7k_v3=(89Y|ޞlgu BҺNh҉ɩ}bofV:#KQL҃bM.+8C@[[Jr:Wd NP!JPT*Y\*T(VmFml-hclʭ;QQl-yZe !̚ˏ+úw^<ںڲ=[Z{B41k8_︴?P:ϟGНLÓ }}kA9],yHÇ ؃`4j03bJvn:s\0m`HGpSʐ[h-l|<2CqxVP݌@L*[Sxn}7\,>k{X!! >fyAkCL1,tꪩрKn{`NjNvA .j>ɾ1*gI OJ %]BU,Y!F;#ؗFXyێܲ ՘^W,& ޥnـFX|(.md~9 jIjNq|BmRjh(V ^"'Ż!]iݪKZErY --?; 쥯29XsLx$PYWkpJ%/Z|J֚Z.e+IW \!`ifQ%fPUR48hWzj4Y-5D6( *"s6!<t?;7Ŵ]?KEe0H'Ԛ5Ustb3%kV7/Nav?>=y[#ORuRVyr% {ǘ+ƬQ؊ev]Ǣ^7R^:%)LvRm)$nCF0.VYЭX!Z blEi"Ec|EAjXbL:[[ZTlk|~GƷy"wb9/2֘HQur犲オ@U<0%nu$EXs:YHQ }#hu'W sɁM l+םÇP*T!( dt(m+kS.M](  Su`]mo9+B-m 8d3 &{/|):ȒOq߯-ɯ-9r˖N0DM5U"U AeI kEH44:#1%0HLcp%cPZy3 eL(E}:3q63;D񎿴d%<㔠i&_u, -F㓇%#%t,P!Š=|[KNH{ pcW%#aTzxmlWm,v&Z/7`P4+*n᷽X"F*ב'S+S\W는Z^`:8:? 
X)u<]14^!@1 *UР*4G^ Q73!ز(C4<u2<:3 -ƒY\rim>8]OME8JDv+Qwq'}=ɏqg;si)-;ib6ն|T5*\u>.JD<U\TpUJ",b{l=T'蹛Gz 9=j0YR2M;2 ɦs LZI笑99|SAzv*>']ju?eѤErr}_J/2û SGzZx|zdѷNDz5H~uU \H|jzWijd2{kq,k1d >ZP Ƌt BM EnKhK\EE3B `J!2=x`ov&6X\~%<ג\~]+MglVf}7@O)lK᫒a˸ڷ[q6>퀴j6&o*JV C\XY!iQhIHsj~~Jd4uu {?; HR\"QNy1@Fza]X!D)I BIFT̒ېg  J61z똤w&f^kє|t:㼞oBoiaXx)0l~9\)omTjokx!0U#DH3:0E^h0Iے_3MH3)9="._^0d`S Y&S K褑Ft]Q@0S3B#} }9jQ^1n'v-n;,ҷHDWEs"p@&4Cf`!)KtIDefƜ @WJz{x&CO+ܐ'9 Nz}dHّCJmY%2hYad=t<8*z|-! ]GF'oQ+P1HK ^M"JY% &KEoYzo-zZ]GġMB})ڳR2!dG| /ht{Y}?snjDGTA(,km t:{AskDxW0~mbw 2-Rr~gs85-fE^u"S01xF7LhGWlsF'>pztAʓ 0Whz4/Kz۳.Ƶ+aNq ?5#+^Z$'_uw| 0ғ{:iFw#]k7Ul'2eGˁ]9<=ܥwzW|ȶ^[9{feR:V>Lc <(q`Kkk7%EnTNWsḹyXS6hML.y3z \`M<{T^Eu[F#öD6Z-v[c=tVkeuVVGe0P⍀WcLSy K/_xU| *M޼>yg2 Ő>O& luB~,g '8ýcmXOJ./qNvQKu/Pz"*A9^7ASߡ"LP0fWc6]fW3H.savzkQ7=hnzq巺b鲃Z^4M;xրk`ןͩf| :yx0 m7t:2, _^IOr6ˇͯʧF9>]ٌϬ Q=*PrsV9%Del D<2s<5[kU/cN5X}ke6XSMʏF×i}`NIQ4KJ``C<2rP`D2(lL 4 I0CD AC2%ce1hlLjݙ8;Ls/ݙvl:Fmףv`l\%'l>B V*hSbΞg QDžl:ì:@21CEG&!ZQq| QTc+ђ %LTeRwJu#.y,2B5tt>io4.㱤K2# g`PAIv9ݼo  v idtkHd=>ţ#YuLْ}}EɪWEVUWX _ ˑfoZnm290ɔo{D] {; Y:CxYӃ; .,Gov?Be/ً(l J|GÐd6$ee"h& Wv Z€ʐ6 CJE̅#@$oe62zlW#gG}<6_^NvMGamt  16ӬNdUve'oTrq4.h`ImPYkP ᕈfe*[.C%ǹvs$xBo!-ҷ'ύB*#\g9.{^w@xw-Gr\mSLvEoYe SJ_6p2) ֜J rq) )` p(X% J: :Vn-V5/LJ~drw &YS"Gjh)E)s^ʂU{ Xt!؝N"W185]մjouj iqjx1+.`G1\Gifк>slD0~9plVʴ M\^wO5Ek:_tQ?X2rY!uE9L!׊SQWZwr^]Aue(QWDdU!N2wVcWWJzu&ՕqZgвTYro?/;R#SymZʟg̵\+L_yz e0QJAH H!GhR$<8DlF(SrGS Ѩic֍^7 I9&ym$N1 AL_)z}tqˍJI! 
;nJkϲFHȉ[(] 6R R#,0mJjdrV嬕@ZFmbTz},AIFZ@'XiOǒY;d,.Ћ@My-) ?:QGYNȐK`'a"9H D"dIf$CjP .mo',~;|p4nr( sܚ`/L1#Q*gUNs2P-+7d/̳l|^t<8)sTLG{3$8H*KbX+B0Ye]ÇRc`!mK XḋH ()gʩQ{]5r#;҉RvUMY 2 oFm浅rn݄pSqU=_EbBǂ$Bl ڃo-!YI^UJ{1 o1ɮzFkM*2#rŶ|7/c8&Rͮl쒱7nbܬO\Fotq=`bhyxip=j>o%#->zۼyH Y:qPr5`Z{i- DK{N&^2 o(#fSb>ذLw\UElSZJBElu|H✤/|F+jW6ϴ#zw?MjA}庥/h[hq3ۭ>(RIa亱}G/-:IL۷-9gl-mwͶGZ/ i-j / MA& \ԧ + X k%A,JhXA#XOXu :cmdh  \GIlfVT )G|,+L6u s.?d|qqSgG?u >Iї,y^!GdH/Yk _}yCGo7c{0lI7smO<׻V+{m+| mpHI Ju Զ$ɘ q&0|JF8F(||zvCVh6fP$ \%brH}p̠HC0E2.>UF!^5Q*n~gսZl+XCm/xU<`rDR1@"u"9CbE$ښs6Hq*:O3\}Yzz4RGs:!845d fG*gjJd Ya$byh=^?wU98>8yG@%țA*JuXdf#$4Jz'tb,,MW[z/k.X#mD= նa;԰#j("]+cYUPitDN׹&wYɹW (#Ó̐h2β&BnYg5$,urҫ b>kL-3ݾ\ )aqcқJ%a|OB\ ,K~+,uo$H7.$]woIWLɷ~'ӿ_g!=xi+}pD$zZ?O֊R;yJڟI.Z4DYݤ[9k;,BݾG,7]=hC}A MC&w$kz7i}I{6ɋifDnsNݎ٣߶@c0c構h󅚿\3Wדt|? Sq=eDn=K,u_ד%=ɠ/O6f&qSOJ&,5NLX*ˬeOɏ<^48++_`}j3GQ dcdkM~IZ,GNyR QN%"$同v嚳"^(%CpE &d!xmRJxIB4XEjc؂Ϗ"~%5rJ5m+:Nj$Dʼnό@km#GJ#-n2f,pF%Y~a-'QSdM~XJ: [Ƚ)׬9$\J߿|uo6M2xddJ <(x. IhEm/#+@\okpO]Մ‰{ftp~ʍ **gvH yb8-PxBG#% |7w5m,VJ[Etbj<8tlN W9͉@R i33&6+ҥ6uglȳu|e}I8)Qȫ H~/3=k|Ԩ+ceeIfN_P\Aw)S|.5X4qrD EXS%~XpD5ْ ;"GJrN?qt!(M9}6[K-J8j=!9Z1 qqP=7Yu48?s(&Y|J0!}?śV0R^U8trr<:^ĥ(! G~BG ⿮q#niTӣi.OD𿕩Eb@ETr.̥7HE=܎삑 ?^O (8Cۏ@Ȓ-xeͰpũlaO8y.>of=mwٛ;*#[urYKj}y|Jg+$7,pu8q,ȠqLh1Q1Jq"SRsⷺޠ~\N?!}w}2}vӳ'I9REY0 44oilo4װIӪ^O|v9iKAtaTl7jqwmoCu:{&jĽL2\Bf_Aq_BqUTRT[e D f<Ǒ:JHCU2w)'N].{2aJ%Tf09Ï2 CHKy x\xe؟Gv|pxRz s\4J8j )dTH &fc<2 8N:nז>R֑;$4$K& Jh¹e1%QM@,4'T#PrdmlNlJvd9Pw ~/u \C$'ҼA)uK!' 
0c*\]Eqpz:nI`eвtRIJ%.TP%EHx7[ڛ'7o@;xj5c2%-3l-(#ק @~i\UZQ|mRZ/I]y\w^6#]~U| -EZ r7~Usc: LID|}|?ΖAY)/4Tn|ˆR*~C",NN*lBō,Dl\Tw}tQ3­U|p]?~PJׯ':hі:NqˊLB)l!s;d*x]錎%NV_:jݛ9vfjNvi6tN?ll$!ɢy a'YPJ@GIyh>Q(M_ڟkl5n=8Y=3<ܪQEq `XQ,2g`X'tl6uY[T +Zp*_u5l[zlW*ȟgt 7x gTUtXJj+!`catf,U7a>J]o]⒑$=qydڹk<4ѶsH6CfJ?9+P*;kc%'L@yM>jb w,aWM3d݉{np7R]B q)R peGXyfs{7 )<(:o;cƌL"^ˈihnDd^F-m*{\p͢}R`F:`5Or&3r}obT0"C0W\wJ1p j`w$ jxC"7.0 B" T)J,ȹYPa:_Y:dfgmh6>$zP>)gguUI{m?<9ѤϷ5Ղ;v/2k59-B`@E01RF1H:i8 ^Dn0@#'ftbaY5P2zE JH1\ʠ&}ԖU`RʭsḟUlqW]z]tᒢԷiB%ڷdq칻rq=MҌӏg{`]BrU#gzȑ_Ra`A{`E"y$ۙd1%YuՒ$3MuuXW#> Ո?F1b蟙<-d 1pZiL fȇ 1N8Y-gL z.B#,"ywGٮb>^7R7.y^=EQbՋK@xW E_:_j1}h"KP^B) ^<^<!}qS!i*lKM®bk#FmRIp5-gs?ˇ*)F&ObYtWCڿߒijn۳wtW?mOQ#}O>G'ݘ}2ȵoEGX*nH.Z4DYy $$]Y }6M_xjH;t_Ð|* }x~lg߻,懘e Rafm botz~ss;M;^0om/`I hlk -čEd|Gzy2A'Ҳ'kш=de+'WxhxjL@[T"cfe/;'}ee?R8ͥ`3ȽR'ɥqб  #7Yfe}J D =w\O}fqgfvˣVxGb'r#i]WB]s߶ti Ed%bn=vحQگoMѓf䥑a2ox.LNyls-4]`:pkY?<辚5ӟ-?V޵-E~ücP35؆dS?ab%яnG}-~0JQzϗ4t URNGRwxA(Y Q ZYz̖'Iu 1Io1E`^ġtaѠdB*'siL̛ mȹIA+Lib3]b!Ef5Dw"7>8UN'UMqOK.cEIB" O֧-$ 3\sEKx-2OX`UNH& ySE~/[~-jסg!Ti ۂck8k8k89j8p` Up` p` VY,&jZmV&jZmV&jZmV&jZm|s͙ZD7_u}Օ;wNy5^fրtNKuǭ|w}⾻:9]Ow yBQuL҆IJR:Y _|{mի.@i!emЙ G4Hmd>3׳9;౹wBd__O.fD%e&`ĥ\=N(Sp[k/W?oJ6H"j r/Kpg `|`Ym[c96QL$w^Ip&qL.h{$GHϲFHȉ[(@ EV6%cI2dr9;Y+W_^:>ĨtTNELQԹDed;#C.r %Y\#&N&lZA-'=/~;|rty sܚ`/NxXJtd6:/:xR%Uq8zO֪_7n'ҟ1cZ"8N/v=0n+rx=L:p-eS ,jŅe ",+T4 .8)MBj,J[v_w%i|z*6#t}Kk9G!Z\bsrA2b)6Sb>ذL\Uو.Rp2%8I/|H +Y(O_Gn 庥ߣhMX V#)w$O0r~OڙEs-u VLHKsniʎ}(C 2n3x BL۴P~p;X={GW~ftxy/S'ݘd=˾\vXjp`O*tCrR$[L!VZ|p٫8 -, NtRgѽU:#/>˫{֗Ef~^oK߶m=hBh)7~tۉ/3dR{}čɂ%< c Ӈs .oA6~Hj/9/:j#X=gԌG킔̺!}FM?u'/xUi# ̣ΎeN;'<{A(-lCiS [WLGoHl <b>Ĵ i3  LQChH!4g4h*9^U220:^Jv~)=LX9_YMx? 5BTqdxd/7]}/BٚxMҀM9ѸU4If5] ݝh.1585)d PI Ƌ5!Lx(5t.rk*o'B&3BI2=x`)j{FL/9ZW5noKsIvs]Wv1.|9ѿ#D@TLxU0%o9-]Ql<]ZI7MB%!6.'lG:! 
*h=h5'Z)o C:X>op|FKGO<Ȑ BR3D!^X4@HhSBObm)1s"L `.F8F(|z hbqCVh6fP$ \%brH}p̠HC0E2.ξ?n Qj~- FwSBxCyy꧵Fl~̬n3{q2}L\*HNP<' L8$f\L"̘ASY|U<("VճVT4RGwmm @pI b{B_%!zfHIG$̰*u<@LcPG OzN"15O͓uㆉOhϭ䎨5ǟ^ Y c?ш,8遡SYd/٤:amZtG!CzQ۳ƒi@d&{)O>V/ӿ-7 UwaL>|PPC=v;Md.6ѸY*Ac7I\L` tgUM4~TIӫ ):NR?GiUտ46x '׳0 G&lJˠe%TJ"]"-J:a4 ū٤=LЯ!׼3oЌ\;Jp!m}JXM'`7?VcU`i'#?ޟQ?)V}r|>ksnHz:[]ėΑRU yh-h}n.kW͍?b8-ksYf<8[e-/^2n]i$'%KO ";b>[3:fzZ:.ouwQ]\XTGE@ PbZlD6PC E=z|ࣛWM'cpstϩ9Ss 5S9 *ڞ\u.5>. M_MLuf<(O;rzO'f:Zفo)gҙ4@N- MQ<ݛb2cOL2$fOgE53~3m[LHհ&?4Wt43Mpkvnc` 2b1:ĢCRXӨ -/ڳk-Mb2¨)I䖸WF;y,$u1yDo*BU$n/HyifPMsBFCD09+w[GTJ!Mʊ.XE K:G8]4Y~ioM GH Q,97 Fq88=kF둽)FQF-vF $f=1nHw7TwvsUށ0aU|<^sp>r7tWu]vzj{KH|?ԘYq,8=\xo_FFć8N 78v(^.}߿>yݻLѷG'^0RчNģIQ4- G?=kko5Uleηn|~)x_|C|jgv n-@&/Gՠr p5wyRӿD,0* <#2P!gK#,5dV m>6U^YKbaF6 #+Ӗ$8%=ZF8;IQ<0C F|eOg@FxbK 2v_1;;;i7;a;K/`;+̪=ly`;)v109躑 C~0Qjy0(5 %c4/<ImdVRLGRES-p_dL[C){XU *Xj1%R)]O .xY'Hiv%Av- ] ){33M>:0:Iyx):' ;.^+ħ'b[@] >~G_+Mna(R~xr(&ONAr6 #|Wꧥv^qTk䢦L ^ aՋ-ruE u`&\HSDJBGo׺m f k_kZmw^msaM`4`>QE}>3MFU|AqǢ3k^.۞WvAd ^|y vC]G!JlЮtsl'H׊6J@i2bbd"[|lѣ6fK,vTKkv\$8e Xmȅ ^%(7`}MP0vY4AW ^ )^K 2V>X6r6(vpE = SvW.yTju>')[&fn>>ƗrKxΟ>"k[[M5+aOKA2<[ oXeM o~o=Gi|cZaJapwb< 6#/F愔۱H"|o= b6av 1 ]A oIK g!2)%ԑ8at 1L`)%r-i+A x$ )[X1cOG&Z iIn@U6r6Tݫw)>lS$#C;Tֈ`VT*wRIZ&S#by 2` <C"7 qfAH$*Ű[)rY/tRn, TW԰Nj9HO.3gk9N|(ʦ,黿%Qgk M-ՙ9kw{V(͜yVH,ƈ"}dL1o`9Ѱc ˪h-JoP=)棶ěU`%[3f#gfgdӅ8c[]O wΒeQYʵ̫y4r8hZ5v]= mGǃH[ Q0emHpȸh*UcSE#aHΞQk %ImR!`V0/2o.Dʬ^cp1wEkg-6Ck{φׄ^Yg,>`YƂKP *ĘiQieV}Aǘ !fE;$$!%7E9p4&>Fz}X;jI1O?Ո8Fܶ^#]4"i숌Q B"-Lpa

    g*>ᬷaM4(F #$$zgk0N0 kD`IrW}nԛQ(%>Cfs2!VJgt,vٻF+odDF^^σx< #%YR$E՗D4b H"Yď8qN2+k몸"gi6B>BHg|!dܷgZ(,/LPC`62%S$"-TlHoc"KXM#]_V0E8ҺkKV9 m#)u3+?kNQN_[džu& zw}!:ʽ)5rj{}[6je=z;S(z TEkRӷn"0ϻs]?~y-]snbm>|1_򆘯usq՘(aڕ3Bpx6Zu)#6ܪ[|ZRsdf!F-]`a^i8G+4|uq)}2aWiC-CWkh@){aÆ $%/MWf /KW;}au#^vCi+"p>@WzH+Q `CWR h5NW5|X]pEBW}V:+~tz*] QCWf1 h}+]btzJ w9sWm}V:+GOV_^mx6#_n-X/ڸnvaˑqo}k?;C+zy[HVlUݏow 1Ђhbhz}ỾW MBj*`d /Z9֒]vAtox9jQW@́^!]EubjŨ+F[@WTc%+Е^(]=COo8jxY -n(-tءg# CW@P:t ʚVDWp+u)t5Zw(Et J[Հu5:t5Pztw[y*]0{p#-ڽ(Utޮ{[{bD-M.ߡ>ȭ>\i~tNk>gS?~8o[R(a\~tvھ^fz~>=:./Ze}*wm 4C6H/tsGW9/^D.O<?~ZS8˦j5]2sW`Gc''`v=i骼xwO?mbvs?@~A>jDq{72>6L @_W:99+* DјwNVWg}8~ݴ!ňs_ϕS9m3GjcG'|ۏVnOuz?}ڣwȔ-q](Py a|nq+^Ǐ5m_|XXQ*D y~DL~qQ7>o?/5Jy`} ~Y!ߣ9ixq5@.۪_8+_Yxϗ;6kK,ǒi}o䣭QL>xkr Y>&cgwy?~@b&6ڋ%Vߥͯ:X$=Q1Fb%YCώqVq:zcA9,5Us t!bIXoر )ʥZVLo!B\EK{@֚,99s-Y'! yI4P0-w\SIh-i氾mmVѱa0P$g_GWd$XtyT@8քFn1=^X$ZCqkМX155(7m0\A)Iyc֣s6b}P(DjGU((IU&2֏[|"Vœ`; ѕ5!\JJ͑)@&^`_` \d#9G1QꛡDf(M_%Cez|=cqB1K6B@F;M+dZG j3|Wl%W' e CX'뭋@Cuͫ)"*".t/ Bu>9÷ : 9 & c6KH `4 (ppR 3'Q .Lu(9oLV;yYi{K6ݙ1P(q 92j$XBY{VLy+ 2|6T2 i`gqIr"YUDI)bӸTgT0-f5㬁@B%KJI8P Pe-{E d(÷2ʱ \+=A |f=aP.3TcƬ8Yh(`ʐvH'"!p"%]{Àm;Oɇ˫vӚt~</UZl/k&!M B&a-A7 /M *}rcV2E$]I,*2:DbQ4X`dVq4E"LԴ*\eJ,!`Fi0Fi0/1 rHݗ9AFJF[;v@vTm 7a=Plys ¶MRANEOe0Ovߟn< UELr!B2,sF)Ve,!zB@6@^s` QL(%@_AsIvk ) Fc*L`4kphH V -ɘ+u y@'^Bjڝ+CEujFPF,Z =1q-$2&e#k2W(`Qm3+I@$dM pyP*ZGxwG$qkpga1f&njpF,(ZrEM;Ҙ!gѝmhd\X&PYICxSw6RU:[5 ."> RoV Jp}AhiT fCFih|0&ZR0[|s9 .d1Rug]t>p*E\jl >M82aw=)hI0KL%wu(wW8HD4J#x"1 TV,w{q|j{LT$RQ b\^|ݜӏ޵~!ɔJV>x^KUv;t"d-h9"]/RZ,'שl_u699`JuW˴ .\kNWjZ{;an{Xƨܚ}ySC77󟒿?sykK5 ZousKMћr%]V9e?tl!79/egJt.:-&r$v![~u[?D&Fn::uFV};Mo\+ԼT  2E@? 
*F+k-\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW \rX*p"WWE}Jo:E-+B +B +B +B +B +B +B +B +B +B +B +BY+Cb   puUY=\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW \ H+Hpr pjUp*B+4\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pW\!pWW FO0hջY[j]S 0ozFt'6JHć$d8|X;rK[oȇU8tP8+r8,r ]UM +Iڊ`WN^,]*ޡώG:mjA9|L_ g/?Us((422.?gTrZfݯ_oM2ilMJŦA@Q5 ڴ҂Z: LǮ\3Z0nWEhW'hWZ+kՀJkM5]\FP쪨彏t &]]K "X gͮ`GvUjw**-C:A)9[ EN2XUҠ]0vXI~:J0c8\]'Cvu^?®8c<Ul`ZC]TݮJЮNЮ V|0vUj;w]] „Rt%"WUQhJ)]`E`|(vUԪGWE+*=oG 1gY ?Z+Mj2I>ZYȻxz5g=#i4C~?^Moie by)5>s 6ҡ.Ӵ 4棪땑^AZZbq˶E % *YZ+rV@{zAa~#*o*.(XO}fʜ4Z~\^ ֨~P~=߼{ן_\7R͕>TFC̨³֜rJlG4-e6jIUfUXY-EaN&[F4#&m+˜Dy}ve Ʈ\bWE-g}RP+Kf@v)]BŮZdԸDZ+g_y{S^(:N٦q] 6=`8FƮ\+bWYݮʾhWŮO=Jfl0vUUQ+iT3ͤ]``ȵt(vj}Y+ cl@vU$ 4u~]]ɲ s]D+ uD93*A}h#2.jEʢQMMbTfxXm6ZB;d/M]/u ~.SeOiQ [Hjn[_.(kbRFtYd3C|v)C+NИNb>:s9>-q/$%hW\y()1%c.'F(ɢ.EFoiNQڥ>ӫ%L|: 7-ݖݫ!klfJeR}rwg>XVwtWUw/cjkob[3cQ2 PРm-o~u';%m9P|j[+^Q>SȊI&&*Q4ePL1t:V0 ZSeD.4 3xeR1:hԖhAyc39!F-ӎQ$9t $ʳr%CU`HfYLKg@ 'oPX}X# f|&@NyôDGħH 3:;-Kev{PT>z\*wƘ>1fIfB^ S-IDSE%UÑ Xk#E kc&/W",;ʜArFw|sTޓ;~.]kyAǞn%_j̨?_Knjp™6 hyWEz(鴬~Jb>0+yy@FY&1VVRn)bͷwgo:?gaI|{ <-DMQ叿NOoDu57ЄxpLu{O@i<׫ a3夂rYO0\ݷ3}%UognLnǼ܅ ޵]<rwc6L*ᨬ|PQ˅rcϔwQzL! 
STe#x'y4O',+JnQTNsw&Q)ML EʜLL %gx&q.qޙ Buw@a>\L"v] {X4 C9U]Ed5&PK$HmNDQnL$cNL&t;2{S6C%KQp&Yq6D}6Ξ՚1w0r {]ɹ~q\|=I%YPfIC ǾIC^|I垤̬:]=hi]pƣۑmf>n'oJ=qYbRqDi GcM:OZ0[1Pǫh:& N; 꺧my46GQl؆I۴}{Z&dki49 >]hǕ)*Il4kȵ o5}6ffOvB˞(n'(&U@xj |9PP2t6t ߚv;nhPd EZ\{UaMẤE%3yAt"@`Ōb_yYE2U ^X bXG-w5~q:DrTNq1Vgg &`a)*ORyk2A.mq))\Kd)."+!d`$Pi1!35˴҄gVgr_څeuJwVZBϼ|;q,z_<֒EgJF@VOTz0Rt"WuEA%% tFR[:ly^QW+AfUYpRqb%yX[]E툃M203U;4lgEv.P͒G knV{v$|4r[[wk>$.h\SBJS,ɢ(#'Dj`f1}V]|RFiIk/IDZ~&DޱN`ٵ!׿/GdY | >2ba8{i_zij ;OvPO\= Iڸ'yK]p@ #AZ@X\tH&fq<*Aٜ +T,$Jbt"AV`CIk9dQ4 C ɣ ?73G-RK%_&wb+%zhڻy5^޾6(~RU5.AT'vĤj0eI\AҪUN;ME๽yn'fј0΃?;F#SL" @Z ;'S*XSrRՈ%Ε$UEcJ!J2L4QBD*Vh{޺sO?[gf+CeYt-Vق:eEcMZdAVҤmA7}b@}8mm-SQ-,ɓ ډ08IF/>Se0@ޠOÐ'=-~{lrj819ԖGShɺ\!F[WHM"s"oCG/cg_6v+miϣĵCC{g?ZszZ"cEOW!~|/pWU;mR `A:Q9Z TqbjL;9lxTT{x(T7AJUuM5eVe aTjQ># vS:Bqtƨb@HSH![7WDm|.u-Q{ܳe4=$2us169Sq8lr+j FW0^?ivJ.Z| tr$^l))Ԃ!hRP֥PJ(!kJj[|p|7AEB`hN#2LCdћ$!X=y* "`K]汜氹hsնYLvY?U 79&pY[uŢ ke;IUóPȡ~j5=3L=t  ι&'v1 q]携ApVy00y öʚlĜC$At͆|䘔7%Je~Sx5$~u V"U&N.ޢ. _:JaJatDH)r>-8ZzS.؝׾moWYPkJ@*v5Ůg2ɛշJz62lW)} k=:*ɡAy?)j9(,>]vPm 2#=ݵPlt;ɱ}o;>}~(zovcA=<'PmA̐^?Qz%YMͬ%V褾Z=rcrK+SALd<@+8hCu AWG;pLQf֥VqEf`JT 49MxV*cXyA;_9uhg? '3IAiخ't'Goitq_Ο{XKm5R8Bՠ!5PC& [@ei?=N_ucev?.4Ly/x?ͮX9 &hT!ѩtlmA`^~YWq+bt³(laܬdBiUHz_{v|~MO|>U c]XVG#u/ښ=1e5ƪ:km ETj׌%"p/dѫ\NcZ켕|l<9whb}Ug8__kx_߶v1zyq1)u;B ]@;6 e '臋~种%v`;y*9MS/F?T#L]S]\/ԼX4mHdV7? ]sxFCۯ!.y%pŰ1}^tnt]:>cyL{`Of2'M/uugserV>u:pt' Ep,氺^&vZƱ4|6jdy&Al$׫T t1=GGΝ.w'o>zr(}5ɸ5M39cE7Ȝu7udr9RW[zvN0_bl}ȕɿZJQMH>Y4xGiÛ->egn]&7ߧ- DžY }]ؤWwh&wKV޽wi6[6w[8[Lls%e4 i߼/]ٙ)s0\IXvizwǁ vɶЕI։>f]YϝU*F/w<$4G{w:( pS6,TU욒N*c;(jkg~&fL==끖  }Q(C&T&XRX Sك)c ^kn}z ztK>8;ɡl3%lϒC_"6#.g/i+wЧh"ǚlB%7wdK&ZLM{Bkr*0h'sC[c=9ӨF-dxe<_3-Maǒco巖;'3(\Yf<ǟFҬȽ|ͣL :z/&ryC,[h8AR6%#S*stN[-O\BOclbWb։jŞT!) 
|{wpNۄ^nNW, Rk8O- ]Ow4dwnwۿ**{n6u ޹q#y_旽E>Cɞ 02q6WՒfe4f$^tYb!lpvLk&:j>>CzlqZZÿvKT?yֽWPaD}'u"]]|_٭őbXE.^>wwi 0%agK>]͈HoˎO Lz/W6ח֘>lUOVw9{?x]{!԰u&ΖiFro~'2QsՏQ1=o^twO}7/_//Ͼ' FO5jvJGK(|3{ $%"|{y)2f񠍖g*o-|"+(o׫* 73N՟RޫC:CVIAU, G (法vH>D+5^wɄ1\ mVG`V3V6țww?s\p<بM1ĭWe㒈ޚ3pyp`<53ȞNVAF͉AFK}zCyg΃NSG̪3N؎ʨv^·*Do&m,v|s]SxuDiZ~4INU3Gw큏JyPut[jZ xxn?#_{nym[ "&_ZD+9iSMk&1(^Ɔ!IRt|~Icܧ1^ՅDžUS0͛XPx22ƦmMKN8;%܀qVL@vʧAZ/[WH/"(a)B w.Tba>@ox&6*3iZ i["@2TdTM+߃$s>;:i)lcc|!W`j|2ocÇ1x(p^$p4qpDMd֤R2mikC\ zﴐ\+6%hL kpФBH M|*pq( meaQ,SD}}K+7܁V 8؄<ID N8ؠDt3 @r Hl[@kDs#=iF6:(09'bWǥi79u6Krro=B_)iN8RݦV+@q!B*b6! 3XV%fg^/8VAp5qG9a~|PA97f\0ZYv@\sБ.u"̽jEG@B*y -+v@ȃ xL0F5;ϽhiF1RgmaNvRxhْJ`B1V%kGJ]IbZ˫Ueng7"b }*zjtX _.'6!UZg9Vɣ0~/FmXl^.7ת‡7W~mY/=86ĝ٧=#=_Ħ_P+k|r8vxpsk#Q Bi5 ~'g*Eq5lX;Yr +f]lRtf2%3/g>>Q"T ,W+,AP"Ҕ+R4jr'_I"\9JJهŐJ_Cϙe<~Wj5.|\S2=p+zn%w \`P HfZ;PjVk  6+D)"&{TZUq5A\Ipr[ HqE*j' /$+kY1"B+R)]qa[Uf'PKҪs J(pv vJ`G:4UP IvcTں0EL#tኞ*J+])B9,gJ V+qI~͎TnoF)W$H޺"p|92{9g0>h/㗿'WO)ެʭ݌2BԆ sOV Hх`(?ӬmQSS-w_Ξ! "^/qyjy?6J?˔ӸiM j|{v3g29韋7/9>4f8/Z)"AGS&>M1E*fQjSʢ2E-Rt]Ԛ࢖v2O+' jV<ǏJYG+B. 
W(Xr I./WVqE*AT\MWYkKrQď\YL@UTJ^sƸbl\31c X\qG5צle9>0Q ImF%ԪRz6: •1++)WWRi @++[Ά(qE*U+'!9+g ^ H-d!J*U\==h/FvKc#Qk'w}?<:tpF؂pqK@+Jv.w\J!*+!dw?\\KQ?FU\MWR*iJrQ0rpErdVe J*&+FpYH0@1"\+RYIJ ƶ?{F] !2q%`` q>$^\+KZS$Ek,$Z=6{tafTխsϭ}pcIqܱ耳 ݣ:x7%;7OGTvM;Z;Cn,uX:q|2۳1w L=?mn)T1P^o'LmGvzDo5?cpҝ+ݸݮeWofzs~[QO[_Ռ{''n5/{?{#q}Xxt?/W?=`v2[y5אWGv2 xeח窗gJ=ֈٯA q'g*[:ME73;pǝvzsq;6\NW+7i;f~~!3!O MF#@~9bM=qձ0j/ ~0}qYAWK^ce]u6L:\oBW@D8t(eztsL 3p%M:ZutQjtJ[aN{_;]uj2ꪣ+UGy,LW/HxUM\`d pT hNW@i|tW2Gf爸((6|?bb|5Gl/ܮ۷۠违;E=CPM{ A>G߆n G%c0;d=7}ec#R" ׋whe\̯VK@gu >[7mwV$x1n]=᛽VA{7{k'_7wd}m'ma\{i,G(ގټ0>Uw5|ZXP #ʇ; e`f,j/7_f|d0o~ ~!Nлyxӷ(} t K}j/XOW2$sZjhaFMʤ4룴YFMx{׿yonV[H F|m}vwqՁo~o-/x U-,D͖@շ䀛H!dQM%Hp}\P)&jQBr](cZ"P+)VM"9Ql_';:[H#;BC:jR& B 2,DH%i6GBAʓ5֊U^e`t*$Q (5[t &YD?o }TCԚ1$kFH)Ldit2Q0URB0'ZK}B,=XɌalJYhl&ۀ)J#I>bgkπHDK3R] 4Y!)!d*ҡa FhRѤ CdJ COi#>劁Ƭ2:ּk94uæ5Y`^E3 <4b}/TWO,1KFK[ҁCB'dHXlOĹL.Vkys*K'jjdU9挑9QJr-^4kAr-; :Ail R"vUS5}GZGWH}fEBPZ)dQ+Q49KBL }T[2IeSq%B`1zԼuړ(.zчD v%d)0sc`䒳"ac+ ])(J͕Rl SI*e»P4.CZ=c%[0݊ NVh sV@kazhm^jvVa። 5 <(EJ<Īɖޕ2A[4ѿթJ_JAAh8(l,@!:zҰ WmFՊXkuڄB: |/hwW$' +K;L')Q_00,bivyq ~9ܩmɘu `-$ >:FjAuPaLJ7#K_^J.G[K*d`1uLA-ENp`CEE ʃZR I"92ʘe(&Պ. %1zOE# }IEHV/VwCjO!3XhGWC`QչQ,TGW>^ՀXż3rmE5Yr&e5D v "} Əǧ[rX;yy^-:_`󸘯/^.smLn. nAfV44z:6t FZ ENiVnYkZ(ΣDjAKDm1&PN= @rTZTHpPaRD6HmJ F >EӣNJdt %L(H H*,m@z |""(f!=` q^/Mgc X@|,7`E^QH"VR2P$UAU;X7H.z?b zYQcԔ*Q f89 T,+~x.]ڈr!cS;SvV댙 +~Ҡ5A*Ei J>{jfR &C@0 >Y tStrgAVA%/b JHN kJ+ (\u"(Aˤ J ךEzʨu!~)AH k|T G=zMڃPCXm(TtqH*MkE!0Xg7vԘ%Uzԇ sȎI8Ьw8I8L](IWrl$dip(MU~DZWW4"o! 
vWy[~܋f+zorbDseC{&56ߛAqz͝7&>o|x{ 8l-s9VE-mt|{x7~inq2 gzZ]\z%;'OD~ ׫_KqϴZ\=zyn>dK뭛맽Ch^7u5{h㺭z@Ć'$ euSrA;;\/&Rm!:Jbt q@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N_Le.K`a1&϶eɑdc%YmmeKeW:,{x..Kw>at%k9[@i^KSf# ǍޤI5^$ >ꌫsoۯ?~?NKI%@cBqN4`q+Mhݡ MWh"NP=MC6.-7cCwW3utUQj;+6]RWp ]U*ʍT=]BmF3tU ]U/tUQ]G6:u~:lՁj?KWu}P֥xEWjR=]=t*`k:CW]+B?v(7{z?tU*t*`UtWЊ㧫Rʞ!]%"JU;*ZNW{+~ ]SpMgƮ*Zӛc<Ǐ @/sr2~lF̌4fhu<`Qݫ'@[rP\̾cN\w7%K/TL]ĺ™R*00Saus{_RYԽ{MFd7آ}P"h>ó*y]\K]5EOYY$,YN7wq]\?Ȑb,(XD^gWV#-Fn@/V%CݼeA5zdvcWOhp9Mf&;j"k,RAF gإlTw*\ -Txn*JyC Vr0]WsS ]Uv*Z}$HWXRtI]`ݡ WUE*J!]9IUۃ/nw誢=~(tf p¥E;y豫Z}CiqAWS^(n8]UJu*\]{>]UtJ*)`~ ]U3ꪢ=P*{+@ !uwUk:*Z NW%!] ~p% ]U`* ЕXP&hQ|{DX묫gZҵG#B|/ӳ+\Bh]4Z1=vX V*-22r >kf2:ALO__?e~o̚y~kjV2i/\t̫xL8VHL.2m6%K79ߧzOWFLE\U{'1_'Ԧƺfܿ1Gï9)py%t5Z>gtk<\ݫ9>9uVOL9LouzW~mO6!l3ƕ-Ҕ7mun#m4l1HDAK.yQI)Xe1cל&p/V.3Fm Em֌+`#՝ӛf>Q,KisPn_;[?j<&>"W}]?{fj7etن(kUEL_,o}^Ng9|(fvr*%̹yqZѽFW:4{LDQbq"{C(dʺ&_1"-H2I,c%`d'q xNGޕE(S(Fyroك[oYoɼ̶[\gk{Of~^]#<O4'Bj} /qϲsҵX7$=}ģ*}ݙԪ;}W?)"Lb>5^f4_Έ'g΂:Rg x6Ka9pv6;<4\ MT/y>&J5lj:`w[\bh#S OT0"}wyvP-hrKPBd ==t\̬E|p5nz¥9 g!&x3&)ozZ؆#-"%*"q𢨐-H[ %1!4 lғN)v yz1˟diqP J&y儔(5`4}@|:]'Rd0}7޻]_oɧfoH-*fE6G<Ӧ U͕{Kdc4+%ׂne_mn;֖Z;^iJi͉m+r^-lK2y +Y!$, /;^b=Gbߟ7=kL5kp:&K2^c Q昣D0>E  QG0SC[JkQQ9/Z[,=)P GCXd~5"[j C[wYFFvVg3(6}ގx1iZBkBAtki{'KIᴵK46 B2Kbs=Ԃ)&3sb :˒ܓ Ҙ^S) G:EA"-t¤(|ɢh:,RD2"s !U`+*b72In6cuk뭏g_oz1v_|@8KLrT=>| K JH-3(7Lb.IH[XG.#]Вji6#xTk%Mh.IyBs0I#StJ`AkDuҀ)`V*1"'6hU'=խ58+ʡG${XB˰ZFF} K.K޸ fׄ<;.04]j*1TEy@MvY'XQ $SI k7Lޞ~vj3u(R,@R',Y@aZ$Q;VyJ% 9 \NbD (FX0gƄwe?k #oC|Ǹg%$SZz "CNԧԔ^D]MO#0At1zhߊ/<}`sT]Ox xI9G )4T{՚Ԋ1?qaO]A 96uݫknUTڡ.9pNk e0GtG+1gΉʿMܚϷMp$ޜf)*Ya(ZuWh!sWB &HSf ƓŢ{okoD]M3qͨg|SϾ==+ʸ<}رrEԍ)T< ՜hJDYў{/YS[Afc~t%-/Y:i'ϥ;4B;O69d=+i;{d{lw(M{{=h#|ӭGsE` 2^gFpio/xTZkPkA5Չ@"FF$d%,((2X!Q(b6xtVy8P)ͅTNdiVF{ #Ńy8%N$}:T8%}jqt`E-^0 !R9u { c^kYuy[qeyL>زM3% dZo[lѷṕc Zoӛ-Et׉Vy4ZaNjM&ʠ {ݱ z"w2HH] IK^F2JDF.e{%t.* .e]osE#)(K<)$EHYgɈ*{gy gPMX;NFi^i{U@JY_ph9,1(6^"6ЌBptx9ΏeXkrl?NP5<h`/LcX4LKa#2R1Pgg3;mDTΆuD IF(yN~adQQV upȥ(2{âޞocu@ͪ¦ ~WOd[ap ;}( αWGRGN;r2MFF58W@&vChK&1˂DEMe r^%4l 
کRlj9*w"L>]PTBp\CJꓯ|Žk>=/ӳ>/~o;`c|m =|/#, .?"rxKVSHe@Ĝ":; ^Jm(-q ZHRWu;vн7?}u,TTKd? gEz2>Vf ;t4bj"O,';r9Nj&bRPG:k$w޵q$2?{ d~?Nbl'`C3E8%w!Em3DӜuUuuX/%0]8ZGbC:=VʩA; ZDl 0ø)a1k6r`mp3W\^TYpy_ˏxު߽KXjV`n͔H*WGBv=GUUȠ7 ӒQK+q̋ᨴIiCVZ q@Y>1؃nM J%,,  aGB,HOcRIf_YtˆMYUxd!R&R/5ec1h# Vάg#g˳XcʫzQ:$}34)dn~ ,H{<[ :Yxbr,%!FIwZ R8XV{R"#(aH`Kq(`ASI1V#rBhG"H&o/7z{DHZ)}W&.(=5:OvK-ߦMA\-&' 7 OZ`P(b d>t6H!9L ǐ":wD,ȓuϗ@p>2r|Rg<8TRnR _3q'=0\qj4!+#>d.ZXo2CqH&v30̤/$mՕT2Foj^pgP,b%DXJXTց#z,4E;k$6OPlM?Q,(3: tq PK1,T7ůMJo"EEa_zmt/{F}ggkH: lޥxɅaW=ubF'@ :ao>@u檾G?BJ@IQI)%3ZX&f1\0YMa tUύ<;8'[R&=KM,I38_ȧ"Cp-OVZ-K'%.TiAUY_W<_ӳe w?!w#ȶϛ;J&)b5RyIGǪ:d4>?}ٛ Xa \wVǸ<-ͮŇΑRU iJn>%֮ q?pR aZv<8[e-^2n]i$'%KO @>nblՈiM]Eb,E''a4 @ȒM)&TLWz:+>} J /+̈́њSsHkrcAPU=]g~y_W3jlW7@b< #m%5–[/Cu o_;6t_:Y~T^완,^ ^CQk&s;4VytW Vچ{& VT|-cV{^Kv͠Nx;ejDU-?0k:[ܙw^WiWbݹF{9]Ԥ;n: < >:a{DY T"R$̃ 6Xx6:h&?hL[z9B Y^*.%JiA{!9`G!t#+AA/ %tdzu(5<Ǝ5eu$1=]vuCEVy "pTS3Vpp=etW:{Ȯ^;Uku3y01n4CU(FHqOp1jtS`dTO,H ׿vٷϞ6~o/yq0QgϾyvװF`\QI7 ?ߋΎ|~C4W߬k6]˜OR;}QxA|jc n-@cB~=KL׳z6 ߬MvlaE.+@6eү ޢe07TK_CH:S[AA 3[Bw[Y&fcnF FEeuހV* $LraĕF Fzhg Zz:DÁF(GO[6:jF,&!F *p4N=;D1Kgg (åNQ+ԪcD CU.+DD{.69ĊPc**JH'ݱZ~BWHb)sբT,$K|xVaJ w˥ur,׉%M s$)|쵲n'.A}0A*1j 3ꄷsj$9h.- 02{N*h< #@t_&LRhfPiNfTD*3X24c56f¨ԜR2 $(̗Y3dk߉u 9LzJAt u$0:#Ni, ybQ"癉iޜ2BrG eZ30䄥cЧFs+%";9["k!N7reֈ`VT*gQ8#rI C0W%>f\yˍ;թFkw$'ɉ[+wQjob(NSLSDn08GN4ò"P2zE JH1 N94%`}{#32x5sPr#c6r6#c>]%f]P9E&Ӥwj.i77Uו of0gu($)(: E*e)k#vl@ GƵW0DSM !{D%4UTH'K`!#a:0- 92#v6r6# y(;vDm2>Q]ka5DyģWK<8hd 9A 9SJm̊ itXȠbVA ('@q 82HȁQ1096gh0M?vEDwͶ|D/.^( 4X,rbs"$ g% kiDYQ+ QA9KFb)ڈX҄Ij$fX;(?9.;e,b5+qHl\+.̸Hxŵɺ9aڤoB Lo3iRfhI e)7 jwŃI型;vCnxP=U[;mO8;BApG~T:{R0fJɴB0J9t*C *x]L:ҥ*&tr !сBC i-!Qz) I߰A!vIȠEG0,5 !zʣQ_QLNCYNS3XTJ`iFZ4?N\O8?qҚKcy.דE/%r3;Zx>=|;XD| 5Kfs}߻BMGjX~XrU{]4]V`ŝ|/X݃q?ŧl|tξ[pۛKܗkɧJI[_gqz9 ]UqA=wU5jJiqtW/]ᖯUWt3~cQ/Sn$Rv׼e_s/qꃹ3)S̼s!4z\YsR?y᳐eEdK:QM6c6eP\WZZ#muV;7_?>+0&NS cfS5/RK"98i go}{wdzɗ55@ yݏHR|"IuՊJ_ۚ[&kORhR=*'pgHka{tr~Q5lօS< ?4Xbb` DGC"!c ֕#{s~QL.tqp2%E. 
rKfʤEHg4ѻ""!ڒK J5(r, HSH ;chGYc.k4^hHIR$dXHfoر!aҷ;r}}*޷%0Jm~'lkOJضc[ mzFMfGQڝO>_KȬ$ hi̩]AP3Ff;:c~w PġEf[K"HeabZX(E&S(Z^ )e6c$he]/hO5ܐ>/fƜ99'z'=#=EĎLbjQqHZ*I3dt.`:$pO!k삱8HPHd@'hIȆx!:C'bDN!޷!6Z2ښGuS#Zg,Wʪ ^J"c\Ĺ V:j@I5 _NJ^TyD|n?_QUL(V*^!rb#Ir"$d-.I3^vvd3OgAXSPr'[),! j*(6@ZI%h2^HV,J"V+-mTAuĬE𺱝5gK; 0a ʷRg%aRVu2CN<(mtTժ*Kqܘ?Ghi_/~zu)IMHQv3)H"'off7BfP.NnvrvrRo H拾P) դ;,9J^k J.T41ј!OaModM݂5LW!x+ZqWɠTN! )#!C48^kHu nEe4]=5Sݑma`68A°TP4YQlE$S$ͣyت<:\c밯qӬNNח/&]dU?A$j2!aL6S7p2I㓧^ѭWo )2vNAZH,=11gPSA5O}<T/$t:l4@[d%ACĢYav+Ј:趁g8,ކKrsQV"ƿ=uL}_ڋ8PJG P}!D(َ9^ET5#e61zãĚMUFgTʯkunc|F( Kf0d%C% ;|WH툒^ Xoj1-*bޞ't,'X"̑]Vh:o:rN^A 7ץ|''/k #Du7l.?}9}fkR RY3c4φSsᔫ:\ҎrSNT1&<>]L(twr~q6{K PWOv>& jQF o$J,.Ya(FIQ@NaF2FͮKH=KP>ItJajH@L3DNF "d_@oˈܨye4yQ0{'񵟔mӓӌbn3p?xrĈhS>է~kӧ |@`/j(*V)j[ r-2*$ !##T]^c(HpB-¦ 1V#tc҉w!%|Тck]SU]U^G&4U5E6 A>n1,\Ҥɨuc(l`~^n%suZ%>y^!n2N{ ((9A*O qAseG?Y!K%jwI䜻!Yl .@1VUs:*szuGTWgӼ_M z_oq{? _wQfP<~\T*Ru1%1Kw Ax63za鬹k~5 68;9v;ΊgqÂd8>XM'hԜ5?^c]r2>}RU?F5vghJz[:Q[]3K-lah-hk.2kW͍b8-fq~w*j爴%~ULZgp6l͈*Ymp>c,ZN*ƣ"[ %*b+K6 E'p%İZ>:7>z8dt6:G{]^ oqJ#Uю;_HtywW}-jbH?S: # [/?Mu0?6l_:b=Ms3 [|>M.o;6U[fO:gIm Vچ{& W\|ΰM=ejDWuث  u;݆n!2]D]Q]J7d6纄qkocԁxX K574S"%D'r%y"T8TR@@#Mm*رNv7AL@zh:%|ie eUim4q^E!q,E ,$t'^'J/sO*NgRr3ر} th:r%G.AS^"PtA$ 3ɓSo9\qJ?Uhj,X\ϔL)+G[HڀF+JJD(Qr%@q+MIl ؏W ]?k-A **gvH yb8%'m] r~~ɑ3ퟮIAW#%S1J(hARq<#"%  N3GAVv Y Jk^dU_Z&)<3~(8HgNs#PVwmg2( {wq)#-@ {S|oS`JF";fiWeqaGX%>9G 7 Rrh8]儝\JrN?UnpuRMQOO-I|qzBr'bjrP 7\y[FKJb_mhK?xU\YnVL3{JE $'8wi{ĉ4hŃ \>d }@m]4sh CT?7w7̦텧( Q.?4迾hf뙽)FQF~̷A=B֌( ꑆaf07Y%lqZf^ǣ|1txˣ2yCu_^ qs  doǓ:b1NEV&跥NhN%įMgpoy7TF<}߾o^Sϟ~' g`}Qyo~ }G>?|мq\]V]>u;w.y͸0.!>3 3'_ ;yzl2Ƽ+rAQ̇M~Uo_S凡V!N OBꍗFx}b LO}$WI/GbK5IVYTH &6CHh-@KIbG+kC 88%i!QE޲1EUgT&XEwŔtDσ\d =Nt dO2vݗv 嵰ΰf Fp6ۋ aƂ;:&9Ey%KZyM'/ܤkb)ogY᳟+黧.*Pr[{3v(v{ >F;!HFaR sLxʈa"mr1^gա >^I>sv>pH LcުӾʠ=00˅i~64ԇzU9UD ZB˹N()`T Ģf|ΑSx'Z/&8Geܳmϼ6B;"ZDA$!)佥wDX\I4NScp@ :g(@,r@ў[#TɎߝsG_d>_bM|g⑥Ѐ2ZW(Ή1ؼ -&z$TP#\3FJd| Ϳ< b\pJ'Sh߱vFvAsrfiLNvrUl]u'W>NjlDۜ yrZjwrkj~5g5X>jw['IIoRbL8Q^YhIX&I0+:BQjBHcIC┡^{c}X[JT֌]3vgt ;]u>J.(6exMfn;xtPAi~5k:{~J 
'J^DˉA*P\YQw pg#9sɾ *oGI L#x;؝]c(p1Ekw;jm kmsG vFE0L0HA;o!h9 )Y8xnrZ7T&!802^Q x"^q2bG:*k;և]NTؖr,MzS-߀=wD?>֬sRrn( LPRR^wtF`/Gץ3,^8#r?@HA ӌ -!I˕%xDm* Kc̩$RWxšB=|7\}:XҤSH4WdO[Bh!FG8OF9,2 JQk"tD,hac`c+3rXoF3:Id\D W\rDZҠAftNOQ$BP/և&QR(w/(2 L@!@<'sF*Qu2"a!rOݞw'W "}$iN焸9TF脗hU6z6@uFtSR\kibBLEf 2ETZE9dF4MbXd) kkxu0|p$F"A^!Ǔ{l՛rŒ$ޓ\ ݏZK080i^?8s:|=h BVPQh% w!t8~{{Q9:n\ų5m`MSU}o Ǻx]-3tlA6uaIbH´W&mMFNEMӑgwmRh: A8B8aڽL%r~cjk pǮ;wMFҪ1JB,/mm:F#S>Pb[ϟ[Mt-.l~nlb[?{ڜƱ_ʗsR79'IsϩؾTD\#gߞbE{Hv,n7-oo2B067{ټ};]p~$pKYuY}Wf?GgMw1qv|ͣz;;v~V^^<歁]Xo^57X;iD93>ȱLbRmaCY ő[ yk#Gk]ٔ)z”v6@ (RO)*+"1ɠbJ(ܯ< "pww]L625of׭в|nneblFx!(J`$(R7)rey$RI 1x"WY vhUc&PfTetۨ N jCM2m V 8΃mdqNQzEY>fIQ`Mɸ%pO-.A+ı%(lA/bW `N]%pHUVcgWDeW /{ślYǩP9/ta/H$Yن!崋h_e_GӲ9T9M2ɐt()3PZ7(*=s! H)Zؘh^7!}W5y+Ź`1X>_=HyIo" W3 w,Ņ&nm~&o=YY z۶ߞ?N{ɱ6$Qs-9˝R 3=WbB ya'A@U.}Te@EǸ{텽d!=6G_K>Y̙C:jfR+ }F[b5f/g0la2^xRu!!Ŷ ԬDe8BH~t텛7+,MLrJ͙"Av@N]qU Ev%%㊜'îdJUˮAvD'%]NܓaW Z]P]}JK)+]=v' &hKW Jݺ|=J( f!Uœڌ' -aw)Y`Kٓ h]ҝ-$:C%*sc<ˏCb4?}"e+w!;e\|URB_CxL Pwl۰ﮆm}7oÚt_ /k"!k1~YYTf 5:LJ}3'oc}( jLDP1Qz!jD(Õӽ.o 7{CdDx-Jz_WR"t&r R]1F: lp6HFZ{Hz.&^sXe[nOnϾY٬NZ8FGi+R=H%bqxd{tg|j> Wn3nxsS5m #MFI$5`ZkK녳XW*Mi4hre1}sV!Urn-a[0 ^pٞ@_AʰLZr^ǛZ>e/-oSfNCO6 #![`J9vQi'WZ=ҽVV.Ot?hy*JSISIJzu2˵m!)GJ/~^7c Fq3j}cp2.8GDhuo(}[SƯj0|m3\ fu]}҉_kçy+S`- ɋFIwZ T+QtKF0SIۖ##$\иe -cx93#0FtF9t!LNDe XH>bhp NJ7xŜ(aHÌ1pv~޶`=L[nܡ먗qMZl/d[QZ aRw5o$#3&w Sc@B^P iО8.ie)h2ȴ3O/}zjYOy6΃=N@g cci, A!D Id9Թ4yn }o}Y<8Ξs<;N#|kF#2γv+N&" de2K5ĺ,ZGCu/iͮo Ȇ&x".A>Izӯ]mzw"fYBEeh^:¨gBSAll.Qi9k]v.7Q7fCoϻ|~IX(,nS:nܷJ ^+e0>3󿯻w˫^ߙ˂%]tx\¤0L.P @A [ n>Ѹa?7]ɤ?麱ίI"IstzSuX,UyU:jvo̟aO2Wt]}YY`e2wRIr%ΑTP%E@x=͸X#N_!3o ]U 720CacT>]L Լ)$jҁ3~ywego}s~XHc_Zl%ɮKJz*ҪBmTtU6ZGEsچ~ˏ 'Ѹa8 noӻܗ/a Z$L?:a{DY rF̃ 6$֡FfbfjJo>>=# *T\:K Y'!B(s$BԍB2F^@{$Lo~}ƒh_ĺ;uu];w7<<)Frc7!6XQR6$(IӍGq("QO-<Fqcd03t yWH)T}g|ycec#@X30`Jv"rl57ZxΥ!2%DrLq$S)tcԂ1dHXGOb&y$,ViǬQFYJ9HLp-p^jF5,6ΎAsnzDz9C[0'H.#VLZct=Y7k(2>=Lٖ :.wi4GMEy$ >cUTGF gXʰV(p>f)>GwD@4:&( J3K`?s'ř%h$zzpͿXrŘPTkQ;;&g4L7{evim*KΊL uVÙJd$|$!L\NO J1u3+;ؙL^u@ճ_%P0 *:mqdOH ˥CRr\֭. 
[WxU712>y[ /SgoޝŶ=+P{(RHJ =:nF0AU^˗p|!aM4uV6O5x[(Z7blT) rS8,9Ǚ__:9~şǟ_^`.:xpV`\QS ?z w{ Mƛ Meh䬫a\r͸/O-wǡޟqvXx3L8ș}2*'(*T[ܕ/XH IRm$DtP\D‹aC_ZGU`V~בAu$NqX*`UhFXG He>`Bk$8ٷ (F\)8HjYɬ@ڶ:cf,//1Fa>bxёgVˈ7Gug1 1fPшw:Qc8@'#ܰC9zBmgemgXպ<fRøL*Y݃xh!ٻ8#Wky ؀  o;#$(R&57HuT3IDʮ8 0щE D&(\ R(%A68ҍ9`"bZ#2 RFtPpth9a(UsUYXBXuWΞ@h&1&o0nM}Tk^)B:dCeZA\al.TM%׍AFq>XEwCxf=Q`N{{`nR}lH[̃oιG|ΓC^{H5АZ zz#yT˃B|ѽs1lD@nH|ळ+y()pfĀ'ދ} H\"0UJMBN1B1BH_X/RHAM7r0wBR'.Qvm#Ϧ CQ1FY\ ĠiDU_}=;Tb[]8ZBTr$B⵱v!dK^۳EZ,F@/.Zf7>|Lj/>9'=ŧd^ժ~“cux_r{~y y歹TOɩť D޳cm1`K ,[Ţk"®ϥpJ*w ТzjY$pLP-DI֌ȹ]3*ta78TY>+]EC{NoOX~Zuq_-ד7ljyM-dT0Đ^U\eQ660chLJ6arXųv*D+Fǹ>tȹ]c9ݍ;ڨ:k8k賁-Q[\Bʔ+lkM "K[e(P۞%f[+wՇl,_ ( C@񢳩^tM*;]uNcm-d%!,NuDa7rnׇSd|^qFБ%F^4.%GlVӦ&PI p *>\*I#D 1PyzEQ Qs39Ԑx!uʽss6fZݔbݸPh:E=Y/8*k+Dj3SJ> 6ކd-lpbxkSY/'3w>[Fʎ6N-' G,(;$ُOhAُF^_.N1b:$BjL9 )2dl虫9a!x"dZ"AO!"$eǾxu *x5Aű6Ʌ*(_Q'SDCQ{ T*@hP;&ԕ  Dbgdݍ;&B.쟟&Ӄ g>;n _nJP4uXsqוc0&zLTk1֤SdBgK2DPzߢt}٠}^[㋟ȟi Gq,?[|1>8?xbQE~jnvG !z.r1RڷEr6+l#?kx$:&l9ЌTiԍrn^ _ڿzͮ#1ãnXkzu _<=m`6gVUQr|_/#Y{݌n^p$xIGkwW~ٻcwFMd1 m([:͓cfrhy:J VmsqP]1kj &wH?quFG€QA7IABx yJ^ƌ>+4P۠pwaWܟcoߔz/:֘_/|ioñfmX.^.yk.֓j_6Eˋo$*U J2q ;[0o:EG"W:[GDzNfj(K=KsGU~ >eIůXYo8-OLB6 }]%4d!xCH֏e$%u|~d#N7ܳs :yq~kK$<쾼oևWv4M<<`JުJ],Эʺ$FҖg"c|o@- sUcLAVV7_^nܜ{rn~e +7~ٛW\|ס;\7m:zD_o&r[bI(57YzhZX5Z Ƿs1t0m DiVZw]Nrv|һwMh bð$=h0!̥seB eBO$I9sbSQgij dMz&=n|X\m7{ \8ZsV!mCO>T61 GP1dY՚.: Sh5 P(r8dJк PqVGt#v މ9n;۞{}{qOr%g4V7)˗ iɠ'򐆏vk]Fck0ȹ~(^X0{ljz2T$L[h9PG b jߢUc[lc)NEɳQZdhVXuJ^~w^9cO pݰ'ncw8_6G֫n'_b>$tj_р'@b@'j_2qopB %4eQXl>D菨T $U)tYŒb9b}+|vV#qrSc! 
EQk+F,&$ Qn܎RjR*^,Y=T'T {kh/w_ů*_"e#6b)]Tή`%GX-,$wY^*Ʊƚ#"&̺Z+J,͉5k;7VBxm +l#^iFu^Tq9kUPΥjEYκsG9o@FF={stJY-u2(TEl6֑E~hd,OYzo׊zEbZB)PtTT Mj"fEb5Lf j)o?>=}msX#PKIUdcc\`cc%؊eV]Dz_|h^=]x}=AOmZ=-o_z {ud" DeH'DY:!L Ҏ꼎ڊ˴4ރ4mJm{/ͺ3c }/UhξˣY#Co͑7oBnrݬ\3^5!f)yE)ULuXdP.X:7^ r>A m~jQr7 >B^Qu5)Yfb(W&6 |Ur=L$A S<]ᱡڈ+w v|n':8)tvޖW(ßǃn~x؍;~_v凡M|m <TӇi)r1X2fon5nV\K^{}HbVR,W9e\A~;H>.)sRRǏ㋍۠/9K`8!E)l[i9|6s y26 >ДƢ罳H-F" ;fڧ˟C~;=\R#> s(9Rq]J 8s(a9rPZfocGRѕCJia*%EW3E;ҕD:x(])֖ו:һEW3ԕCn~=n1P^".^]w|J(&13cO'?O/o.r7ƭV֜=2{jl hч޿"4M+.C/Z vVJE3Դt%]).^t䧮+tj -t%bP\˽Ji+Kv5G]Ezʮ;S~vReUxbՇScG`2p7pWWhHG&] z0w+] 3])-ĩJ),D@ב8nt܍` ueAnHW  *.wC. L]WJ Kgp" mGѕЋ+DjrŎtѕJ+rk.ˢ -El~d81xFb%kz{6rW nߜ j~r.j}aAIpJܭ_$Ec#ADVW3gCBXfxRϨ=ZvC#v V hXQ1q]ҜBCgղwnSRMuP*fVEΜˊ$8 LK('7=Q`gZ:ehAi}ЂRe$tC 7 \ ܋J)), }OZߍb/RZ?JdWU`bOk$lFW])|v]}3'V=38 8u5cWhÑŌܚy: ]]},:J_tJi1N]WJ9]ВgHW:jm7R\Ji2,ue-;)`O܍7`/RHSוP]PWKPCG`])Gρx-mUtWV]E.cc"\5da+.wiekZ)--=yg#])pnt%޸^tSוRen ̾#] pEWJ&FT)9.O=7Jq+u~RJ]PWYר])獨3Ci])EWߌ>|8~gppp~#:y}-i'qqb#t][;ҕGnt+e7u]ࢫ = 9 8@?R\JiuZ8JjޱIW ѕ׋6N^W:f좫$>q>Wi#ǓF2r´OBURL3~k^ex^ֈգ_MGC NwgZ3> ak~ZЂMOb|7R\zѕ*JC?7^t+zj>bö#] 0؍y,8}])]QWIGR`ߏZ4 *%,K]Y󴪷԰pq{ĵB|$-gHJ?vkk]S:ҕc6])hIh uq ߍfFWC/ZnRJEW3ԕBOٕ7Jq+ݞ 7'Ϝ;C``?RjyR+؏,l῜,F7z?SF),q:#F$ 701vrи6g\ܴWR'5BK,u4Y|l/C J˓ZPʭw|mG`"FWϤcO퉜+&RO/nѕ|b7RZJ)f6]).^tוReI׷+|b!xݽ8࣯(EbVhEJ#kG{^-]H %^߾eI)(W6?Ƽկ׷Rt`\KshK,![VJ;>l1Zfg`?V14?Xc|ܧi?w6Gki)7qܕ$*?ך7?~/G؊We_>-,'tDEUڅoçeG;f=96_{TO7oێ?ޥ?t ݴUm\~Slƿ߼NY-!%YmM*f9&7K. 
ɒyw|7ol=j h%>}}D{^uӭ}uJ!uyzsQq)yc" Fj< LYA>;HY#x Auǭ1L͚s&UcA.+jqra>}?LxV <~ 4X_jS% L: "9 N!dpj2';i)2ݵ쐴ԚD([RK6"J1j ءBr6E}wʭe\.߾}w!)ծXJh}0* 28%r-Pt (5cMJhZ %4fуD䘃+>ʽ0,*qJNP=  c>MjuG"2B{ц uV D U%1R/ :m@*ZZ!$ ^]1Rh b^PDnD#.H{Jm^pTRҡx"%ʳX2 PHtoDmUiy 5ԆLGIIP`#{!'4P8g |Kw rwS-H$HR]Ic]($QZMR!C1im!aE& "j4UO5H!P`ֺkC5HPVx}.[e25$" 7RD4Pc#-7%+!/Qd|;a.rzgAy:!&Oav44)2JFVeG><[$xƁTPQ$& o$גvʒsX?ȷ=Ķ;hl8hd($˓Xu>R>E2R+<[(IXK+ qqɭ%i ͑8x*3\oe۸pҥTt':3acӰ!g{KEc(\؅:ӻV%ZR$삕W9veNWS\6 WHc !#,fE Mpu n/xHXE<ꛁ͒i|UdpUR唳awm[4xmվ̂o0$IjsLHɱsIQ(žb["ټ]}s4K丈^<`" P ˕3VH40[xk|Lc%@ʂA E"UqSU5J$Xu޹TEoQX$<@Ac#f0!vAd I(oL tSȆոTkUT&+7/4R2AV%T gԌQ!] QQ+X5*f= 0=$+ ~"RDbkL]ZJx5{TPۊbIلz`o(@H*,OQq@X*+T햡 /\5$Vc+:c"& LgA:iuk,3f̪(N1ƀ/N!'xğ&ôhgIgv0.Nrҵi.O?s-*䄨/M$ky2AT瑵IxƋA G^*-f%^E Hrc005n8&8bgP찱ڂ|Q cA1J#H*H2+Bi,Ѹy Edu^?Yfywu2u1qWCjED)VX;KnCzpC*f7]H_GWT@5ʪ+x  B R{pvp VhGc ;f20jxrI'#6Q'kFV8-5* ΃_8Xr6)6.63jAqX(&RPHƀ)E+YnT\pYT2N$Jhƚvi`,o3,4LkTRp0U U-eV hߨtf&= HH/mIު0!K֭hL69Hcyڭ0:-]Yxf]L.es ȸ턨,ԵW@7]pd3 L[ n)v`)jYJk$=|1RЃ`lۈcܪȰn$>^OW؊"PD W}\'W SoQGu63+v*ĶB!DE HbQ<T5uclep?{ ڰ(RVx'#80`ci FjmRcZ |,$˃>0t$ƀ?!],Y#۾Ap^)SC[xYk$}(1xZi#be֏`93#0ҲB3^\D[biDF95h,82{ ̠!GX6 8I*t+/A$\0 懍9mFِCL*VO:LD] 124+&۩ o#!2 .P,9vQ-0 U&y]p]ꊄީEF8Ø!ՀݴzFuqыպ C>rYYEVXD;`قױ^G?>t,Ve\M1믿~=&|\b| ) vEJ' GMv YPd}Kzl?t\3ihn9b•ޮVˣC.+h>lV8]}ٲ]i5[^Lݽz&jvշnC)jrfScդa}&Tz7:~H#.'^Ԏ-Faϱ@\b6m֚g$)I%Pk`"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R}J $a^ۗs&P3ak5{G5+9aM*"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R}J ^+r@u %+`@5@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H *LkyIJ ^Jb@Vg&@!_@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H (y{kM~Vv}{}m4_}Ju$-N@I%.\^p Kp.=|Zхlqxh:Χ˷d/4-!~yxP8w;~질~nl[+I"w9lJJ”]L-WlCWkoX?~,Ǔzm"E Ѝ^8$}n:?׏kXx;K ȂW,;`ۜʠD"7h%h|*sj%. 
!4LkP917ބ7k~g{ P`!@[5{ .`4_K'ۙd".`:^΅P}Tn`bfwT#n~str:Y>Gx:cas\S[$=,<- Y|DŽt:<~IicKk;l0LJo7 I옌AƔIT+W>ֻJ?]sʅ1;w4he#]g6;5]YݎVpͯ ֿν]^|S -fuǡL+,_noM7CCMd'nG14 |6U0p4 oKE*D)w5t;BJYָKddXۊsRP4-8;ű|exe**ugĐ*v!q3j9bfΞqէ npTO f^΢ ثsPSlBZ7tQeo'"!p&Qҹ2df9x2K\*AqPJK+kP#[Ad/-hP+foُ;Wy u}4>s|+N+=wxfV&&%r彬ul%gSĐEeJ$eY?0m+aU9y?=O?c\7V7欝60~xV6w}sy8S,ͅcu* TYǤzt,M|uɤ Ex3"௳z\o4o/|{G5݃ձ Mr7! KV̖gZԬX)A1^xlXKt!7rtoqt k 8_!m޲Eorg9_]avi\mOL~{w_i==wh6ɛ@d7\Mw@fϫ Tޡ TR}}k*f޸kkᵿN>G+19f\J]"t3autrvc؞MY{vI]!w#ɽZE{]79LVV9<;HDG3v",ZNz\b]OtSkyvMru~߲, qS𚯴=W]R/=Ϯ۰`Ey޵T0`}l7U^0^}"꥗ Nd3˗"lreJ!I >DIYgvѥ|jmҲ~c4ЎW6.~)M}c*ŘIqQ2a:ΩOT@( 6^#8Hy x) k}k}a;%!܏X{A(Ž4T5%hS#a?[9kzsTDu} `6sɛF%կ݆Cl7q>؁̀#8n3hnh&sekKjjfgl;@NfY)ze6}*/vro@YWM0T/^TKF^)Z֊%;Bwp4^_N<ҫy3xMlpvdǾWۄ^#mS~A% >ֳ#\Z&qX~>Xy*YmMu73E~қTAw>.>v΄" MFh%@AfY{a2 ǸΫ˸s|mn<8[EQpJIKU U=z 1;aJ~E㪓ɸdvJʒmk"d/GjP#i|u:o+p{(̈́{+ggAp'O͘5aqmS_t&zh=-Lе{fٖ>oy;]2tB6ًOLj\xmYʭc*D|Kg&&lgM٤{Z.;e+k~&GYϪ"ٽ=:Vknh DJNhAx<*D*) EKq+R)Th7x5ZW{4@ʪ J F8xTFKJQAhzWDe鴞š.{^?Tcҙą9?C)rly:),=qz|Ϻ=TDEg`jԆXQ M&'0=]{ՀCѵ'-݇K=3؏:VS/kdBYfQ0%(9)D%MS2EES"稼RH#‰ଶq.1E&eh ( Lま6= >̭93%C_d)t_^^z7d}Xn'Eg.0!ɊdCL1 "!X- <9eR:~P,Wl/w;o2q?[NgJlGaH0o3;,?jɹ7ErE]rQXlQq Qed,'#.PN9:Us]ϏԋkW-I|qzBr"%bfrP"7\ I6dFJWX$A>_E@Q' ë_04/54kehUøqd\&>1; sL_/psdr-7kf;-\ b~f*袡TQoQ!LF7 :A7gƅѾ0Aw%d#'m$YhQTBRGBd0P4FBk"5F'q KW7Ct8xK*ICڋec^ΨDM&u()鈖bBtTQ13Ȉ2w_1;;P ۹s9lgv'lmm!&lgw](sY&D  j0yt>)C:8#yuהÏ"^r#4a+9y!C 0!lis8dl9̍ %JDVm4 Th(f&e#܄Ӛzx鸷RPX5!R:(M\XʅW%- H1r* }Ha ENW`Ԑ;Ɂ@qݐLȓ3=*U)^XBXQg 4S%p d:;qR #9=z9Cud9)iR.p^ ^7%r CqcCs}%9Ew{w=cNP͡cnPV9~P 7k!N穄NY%j]յN-%H7x +q{;6qsC-E&Ǽ *[(Xr!Yy,I\zSY+$Rg-Dz.hCL$8T.K3&HGB<"`pU)Ӯ0)F5](Cʲ^īuO~n 7ul7S,J΁0 WXNX e`uq3# [tl2KpFQVph<SJ)"xbjL`s*%=%QzYzpNڧXw9%Qx-=uF J=>XD"FLQS Rpc)tB]dM^N,'^F˻XU9aYWsl+\Bn'}ݻ.G>wո#i؍?te-u[²^bP{M 3Xr22VKyZVӶvX{LPO tS֞Xl!Az>]d5pj9݆0Gz_һsJwnK@uOxQnn/ ˕XZ[Foy]J{HL0#{/Gx}mhp;j]zVϕQ$?Z+]u1% 'VN5,:ՀC9հ'-[*p˸g& D4$E|4%jHL451sD ![)P5"ʱA1^}H:ĿN~U$vVF %A-Gp($qPDpJKE"gi4Ns3ۄf_\aR|oE Uľ1^;"ujȓ;Z 驩v1[vGO0;´N2ZŘp|zL$E"ı" fpEgZ(V)AR8ez%-$X1$R*R*]3*ta1W^~QpEQeF%ڋH_mAmGhV5vR$pPN"ZN 
U$hmȢL$&\'iƞ pgYmr'!(fJhmc"^95Úy(Zw쩵ڦ=f"&DɐDN Dd *ABŠcJy<7qIՇIHe=L-dDЊ<*5!/YdK1'`&!x4>,Fv}ˊY1M_?ՈFܷoB9;iR(qG H!m@ꃐXDǀfx"E5є85N( XzHxޣ%@q(|fF,FG">j=koGe/z~a/+{\'XCijDHgbU zȡD#ز9Q.:[Òu"k/Ҏ/v|qihk' <u!) Aw>\\:9)X x%#w?Һkŝ&vyrO(}>:#W*0{\vWp_QJ9 (_Ap./!@m_8~ys[;zwp)zx5~ `O?g/a\E3߿ ںBl%38F !dyEJ_gq_xqU@>bۈE[D|6Mo;.n[?E.˫IIUu\T?}R 1EqNlmQaf(N>T%wlwM<NA>rFzFT;k>˹lor=IeT̒RnoI֤Kgh(pB#vqmk~kׂؖ}meRA)J N,2eU42BZbD58JHu:C19U\$B 5p8 h5MĴlh OwIӵd]G|S awŚ?2\IH:MJG!4ca,LDXN(%(ϜiAĤH@1\S(}(FRX4@9<F-qkZHs>p'\t F[h_ԩW[:S˘Eȶ'{39W ӸMU'1kRK\(X=:M}Mm%p>/bl6Ow;Jvsl+yrlMnU]Z:6asajFDJuukUrޢAISAsMXPZ@jZ@ 8g9ڑj6p4ʟָ sr?=GZdXy8xKq >GL x.)l'n7"\fwX1ѕ/qyXB )~sbŻ6!<ᡓcFUhx4 /1Gp~ts}>! BJCɅ0 N%_S y87y,QqQ|_4xԄGh\ȲUaU7.~D((}9n ;cCL$@Ti:EzY=*d87?W2 {r*y JWW1ӄVf9 Eau &ȑMIoUPb[Wϭ˺q9G:Y`vsW[rkn-~gIBs-p0Xr[lΫJ9og#O1;[.9<7%f)jN?iykApؗMoHAjѺZ_1{?yA\JS0l23Mx44Ս`F'Lͽ9m,0'`pQC26ч5~j未 -[_y. _F䞘 gDEDzՔ-*Y7SM5t<])M!WBg㧋&K%rtjλ(H^ _ǫ۫_k9}õuڎ->x%Q>y\nTu^4WS NXɅ6g"NsqgZuGæ$IMAb{Pk7͞qp;D WzU<EYR:[R*8ə#JM?mE>0EU?֡J$>il*eI1Ƕj0HDZ4%ĔXB `D s)KٝZ/g52,5sJDSxG%sӨ+^])7er!~.mFնL>5U|h|u}4DRP@&9ڡ_8ġ4(Ćo#fbTmW[3jZu?V&_7oތ/=`A(\O~_+le#E&3)"ti/c`$|H\<<բaaGwH!ʵBfczeӻR|$pr~8<@`X٥oF"3#8bP';"",T"NVwׯ ߻@t27߾ɿ}{}{wg@N޽O? Cp!rc~J6ߝY[Cx ڜu+HS^0ƅ9ibnE@/~m?겐$-˹fg&jĽ,pevE|F2MT 06_h>_@|dtPyMTIL0e'Ht:`_{5h$FLRT"H lߦ e(QwHml2~6/|j>$h/I4zwޛ VAB\hLIG<$$ktZ[9=O3 de+y"`'kKmreSMzZFXe]y?1vLlmgx hɁeei%2 %9/HLsx:x dFhBWOs|ǐE nBINkρgM;ݱ@>&cnn S*!d4ĩTBDS -\'\8eٻ޶%W|ڻ~ ;;vL|G Io5Iɏ,b9!؉j4x鸷RPX5!R:(M\XʅW%]Nx1s:@,8, p/"+ur`=P7$$d p*2= C YlՇ'4ǽ֧Kt }9qR #)xQ2g,r3#M* eߕ=[(t8brl _r܍fqgY‹/-ʪj]ZwTb}NY%j]#N- ׬@Sw/^Wvœ8Q!ڢ1/ Kp\HtpK'8T/b*B"u֢$pAb"S%!;c$:p4#X: bl'5+(|#af .S#tʦl{w<._4uS3h8bGj, lRes^9XyUʥo-Ev a)S`BQ=J>Rp`z_ӻX n7V}sre,\SݚF,oJJDJ m6il9X*N5n_6.{* Xa]#v=ɶOeKxHxZeHDFdpY8FS cIq.g',3jux3T&"'ZZFb<)f -.a f>޺{o޼خOBPɗܟCq$o7Kٻ屷IW*&X=y,iݥF@ZDZՑVṳ 9 G@ҝ,@\ґh#z. 
[binary data omitted: gzip-compressed log archive (zuul-output/logs/kubelet.log.gz) — contents are not recoverable as text]
Y:yKtqOdc [FK1P0fLYV #KK`REdh4^(5 hYȒvg yk,qL57;@L+?bD7GM9UB0 *'cl6#5͐LQ$ Kg N\N\e>aQڂ$I\hI^i0?)r\}ctŽY}'Ja}&51e^d; Swt AzPF[g"DGZ~ltȊ!F`:.t 4X: yuGE0Ȇd62w`-1=/jHJKǧ2r$r[dZP!)qCu@u b*xSEkSukh׻:F䜡2Ӝ` +6X =Vjnc ŔT'DƄ9:̝S˕tZxjvHޑWP3{>lIW^̸v;!k(4t9iYW[E&@GpsA{nb'eGreQz!µg _>um[zoj<͂YB>\*oGF*Y Þot+ Zp9i|Iݜ0f&w] g9(LZFΑRi/ xVv;aQC9R3~c$/b@G𮵰]gc ?Nr^vGsZ`W~7l#eg= g?$0z4Ҵ SV_g߇e&]0z"ЎQ SNUlGn͉O%ȷ33|Uę `"<ھt У -Z͵h|Oo;@R ~:lbn-3R!c5cfes*%Snq7 Ç8@# k2ręOfco6Fplt8Ms VL)7^ֿ1Qh}2ɤʴ>1#_`go^2k k;L0BPKV>9>tAJ4YnVϢH <&L` y(NG((s%hu՘~nvV? >^*/]p%T%rja/:EAH"h%z>jiWsyE'=(:-cp>,{z L}_Q;Ѧ {Kdt8cu0,Ŷy\LYZG 7dgdUL#-#]Qadžy2~P)ˣfJk7V]&"I~J&r9a$Cm$tϐ# \:ιg`ś64':H"fdΈ>*4ې!$_~݇6x0r1pd_2Ͳ7x:nA]k3legN.%~σEf`/8DކőzB>Ud|a>NJCf`YtPg@k[,@E3xW[.5Y?k^SM*ԇ)׌gc?Y#E=9,Gl38oqaVO,*-ְ|8!#T>C59{AT)O3L4p4Nj)cnx :\ pojT¯=B6t  2 O.)t*~(3Ű=*J,l-Q2yJO llϱ@v5Ol17٦ؿ:/5.2}ѹ<K/!:\ך γq$/ 0]k=Q5˰ިrޅO:Y/gtp$ IlO߷Dj"=*=Z;">1v:Ru/f2h8 |2GityoɄ`5GH0̂YDfVIiDFTK;~װVp,3W$ dOxgh '=IjQ*`.Y4F{=I[‘pmskCvVщF8{IhA՚3|*j6Gɱgj|t-#gmT?#5B +ۍKcFcY{۸+~")&AiapInƖIΣJ֖V>R͊˝3̣G%ד#e菛 u۾5] vjpmx$缄Xkv@X;jAb-HDHmPJiTI@i"|KMC oKF{?i%kxCM1CKCݝ0DQ Am->]iUӴЪdЎ֋Ir r9x[m #5`ׄ6˿}dl%\4nm$ƚKoq*$5P?(T*QѭZ_$l=q9b`6ngDcѽZf\DȖy]@3$T(aO"]<ֺ*B=@|k`þUqJ;v0%탓}sLurL^hq{Y!"@kv:+v6.9l4pIQD.R dB?41ܣlu[%\$s7n>vri+F^ T^#SiFw9pik` ]X6]os %G1*-k+NBf-RDJ;ݔ*ENrD š=DF]f>TVrk͑%tqعO!c0HYI+"jU?oooHqS$8;_Q,YcS{(TEK )o_|kB'ԃJdS|rems 4"Ekh;n`Bv"=ʉ:Lpܧ|j.p4T^MNçtG|F!]f Ln9̊O0z=pwg#of:&3 ~3& X"x=\G̋I|o=DъÃ`o{>|QLݫkP:gC?\x~tEwГ5~ȉfQʥܼ=9Q(VB ^s%I1;Da2wKVr0A||___~1,H C&m~ӭ|On?D[?[&Q)jj?k1ԙ֒k>]]b KR*lo.WBa7ӏu?O@0pp~03F w\R>bJČhfg҄v͋/LlSڈk\卿Y2 x&{<4I xV7`Og_ʜĆǀBZa,/_.~pݼ١Ls~tUM­ҟO~b3p1Z,Kѷ_Ls3qsasܚ ׻;6&?04N_[AS,|݀B`Y w\l|/0XgS W FH:+Fz3HuP0۫j}|6)l<>;=&6ؚ,[Le<Ø9sRf9ir3N ;̅1% $(jCs^`Mv%#E%(ܹ2D!&ZddIQ5Prmu9P!dn w I\f32Ƙ̄K #{2]s@+YE툓܌wvd2%/[ 6 `Rf*V!iظ4d@; yy]APˬiv\ɽH;-Fb8+)P VJ[~ PQ%U0/Em uN@X9nȾfI|f7% u.#A sL <@BH9Yfr#5Az)kΦ5К ϻїິ$b1XaQM'K˥5N.>{{1/ק#?.!.~^|\>wo'w>3Ur)"874sBQ1H>>"96clT {aUy%aCx0˃s8I)f.wn~>(m*C?N j,De7^&p|JezWý(v#b.{T qǜ p]QNL_6vՀ3Eϥ?QBпXԿ6h{Z>, 
YAkANz@W\/.Ñqnv[[uY7&C(:U*f/E_/y5r@/lzsf|ƆcZ3c[JW\}6hAZGx}<^<:CH+29}4"'{i6L1?a;Yxܚ׆𯀆[f噁 S 4RL⠆: Zk,s-iHՃ#btUP0pIx56Zc9J`虚{xIF,(2>\Zr!:ĴqE[딷.$tUP@H&f))THQͤo݉M$E{rQl!FTcэq#)I|2E'e-)[C8I׶[-L:[~"}:AhVCaRc"1I)ΌrR Xs Ƴs5kۥ׸Z QvߗThPt\i/7GD Rj*oo_i[ Pt =͛dwD+˻-zV 62\q?<<`m  ,i~7w0) MqÚ/~kcQi{-^I oeb,ㆷ%LGiUW>\Eo5ɟ'!WnxG)o2! 1H7-iTO+͟wxi"s7'#y91g&91S7%)ǤK<@ZO>x]Eo3gfG)J3U6eU<Лi<ki{ű16y>jQ41\觮pNR%>,NEn}HeWV1NıŨ2:{\zCD=r^#[qIB+nE >Nn Ayܪgs+.1pŭG-YJHέzg9܊˜Zq+jqrK2Rs uW|K^1jp+5ae*L*׭-hx-yx"wqUݣ{׽5U樒=rO4G{TwqRmޭ`cm9rt{n0QEnr6?GѽGsvqJ Zi WTIMIʮi*-GYl)n?#F/93jTTATXFAH>0.d<ז+ϱkH2%B); 0 R(*WGQy)BL)f:T* D%3sUd0jÀ vuP>u}]GC֨{D&#T _?Eֵ>^= gs`E `qJT)e|6r3Zܺ@ ,t&B NGpۣl.C:\8J6*qW!Io^JlƝ%`Nx/h634Z[ * SM`Cg`u䢹zSkIE *SK'Y; x%b!(U/E`IV ^TfRT[mP d SM$0`~®i*Y%"odN{/tV P9I"hL2f  3P &!{pAMQpBkx }]cf^ y)ezŜ-#fyiV݇| ] 9#ĵ:@^!»|}}PY3A+~W{MO hyլ~:^d4<_cYmiYq8o1r.wA5ܫ/zͩ.I,MY,8t~cl3^?Qy1 NXSTfT02"Ofo$WÃf7&q%ӈD E(a̠IyNѓ@38WJZ/T;UdckBʧb~*}ѽ"ѵj"s!m[Ro @YZ@vmXtw\G<˵$ ǓjbȖLjNQC~:Q{lE <N]~!~/Ie@dsVB6 Q0"OOt&C"R hսl\xn6}h Ѳqk2vc9$+]?hf܃&YWraJ7":I+Ml)+?<_k5/ -yzUb{5kJNnkcvo֜ݎbkYX=.h]Ceڒh5Rx?Ft=1J.St8ҵ*/7o6ڽ3M) Qte^ղH+r5bTXY&l3\x|X\> ־„@n)דпK)EhZYPiց$+2%-^LA.(7ѣcߎ /ŽubЕ 3 ^0O]|"eR!N睔PՌߔ$aDI1$ "PɘA({X""&q1ztT$q4˳R0ɱXX\66,ait;$ H($i1[*"2nO"x.e%FTL .V3HZI"XŦÔ=;4II.)0g,ƒjD2xgAP4ӄf" ۓ4C+ͭͽF3oN59P@²[2vV_Y}q96%|:5ԻhOuQn{rJ>$: n5w( gvQ`(\8}`A%. vO/ iys6H>yusvuF@|mB?1Nwww[HcPzGF+Fnq_z즻&H>lFvpOm8p<|0Xaw}@H7oҳS0$C3Ona6Oto7Chh*UO05}so͹5}t:W= k#JZ׿QzsF Q`^k֝0f6?uӝX=m&h ^@L ?禫Vq|ht+*$ ޲6C?sZij| 0Zea@ۛVgv{VY"@pH{[w>)R5a4)J_ ՟ɉG{\o/[O5Q);u91iB8a25%㔄)35e"*6 FH ”6,%As?9a{̚S$֌/کzʀ&Ql7t1e殏2pOYʔ%(Mcs TORӎO&$0A)SDO1@ &wh:Z+B/\2;}Ǩ|rh0*|:&\D}d { o>k%CK3a' k !)*hQN yӒ~5[ì^BUF}xWx͠}Pp&$; [(| wɷ4o)}>DT`` k!e!w-jF2`V2. 
"!8WR{2d&YS 65jJ$T@Bjp/#dϳ,1QP)U: 3]i>{m]k$Rr$H/J4)E8 D4K俨!m S>ֲvo[}ڧ+J݋QxN\ =qBx`#D`ּ^W $g )0 >ƕE^-5)fy5Z~<<迭˂jpQkhҧj&GFqŠ6?gnYߖOʬҩ,nTp$,2QʡL^ dVɁ^ϬV`xVUh"zp(͒!y9.ɿ3/~QӬzw՜IoTZc.)9VaW}?m}1z7}P<~5*$5>'}BuVaʯ89x8 *!T$`hJuhS) 2i \{v Rچ0v Own-F F9A_¬;0`@`7It vg38QF6Stz*sfN'Gm2τv/EZ%> EVapC]C!s"*NY17& pA\ڤQC %v{wP\Dst-:(`x@mADH oGY& B ҧıkQo¡5qA15uX}~ؔ1|?Љ^;[% J-}-X@!B{j҈ G? 9IT-\F {]41_b)O5<0׆+e`ݔ9OĐrЕН{Jr*^u" 3S4aQ6Yl4zWw˿t\4!zlơw~x7ƾp88^t@G[#`qQ-9E}WJ:/`/!N"xUw'AT9/T?iБn({,dI?4523'R+|}z`Y|^I?g2Bч_^Gd\tFWogrls??fy3Q@ ӣ`/@Jx)^w?9sƊwbDEA<ژ;ؑ8|)@(T`9Y<|[bS{Ty>H᱐$탪1n^u(Žpy)0֔1:|*.|p֚9CN ެ1eV4v:OYZvta?34F H;A"JG%Y2ή0GpAh6یg~lށ1NIegz'k5982sι'lj&nPjw)`E% #Kr6gӷPU`<мkBAGxU8ظd0 ?^C5H!t5>X!}a=π0 1Vgz oq+璁gBIw E Zk".v2 %JDe9Yg(Y“4扠 g(FP򏀣ddDQ]P!S ߕ<2ϤD8ꭜTQmSOrV cQBEuNXTB0i1_F2zDB0 ؞e.޸#B0wJ*ʣH^pF3B )2+dR >Nﴂ)Q,J>~ɲU?5i/oYv,jDXf>icO 5?WJ;kw%D_Jh\ީxm|5&tJFO\ϴ@fԘhG%rA2Xn{,7M"kÝX!qz,'+֓UnÝ,* :-K;)hZ :w.9n+iVQ EGk;'v^5ÀgY8+,}I* ͫgzTQؾr.;܅5`CeW(r0@vA~L[%`h7$rCLɷ>04(b`T{e0$ЪUu.a9yW}SvN{v5O7 `SsZ2ח5wy|Lt# 6Ѩ z6K)[>lj rYs1]O\7x釐'g"C_o/f|I9M")YV(_91˷ !88yM^<ޤGngsѕ\>){x8e̗~/G$ =sB>_n<{Dn~qk,X)K4krC^8S5;$pP`S־f}2 i=u?(Yj@2$(‡]#g:$kFN4$E?TP6ga.ߧ, /I{(&(ntLs,l;pvfF#Ik>Զ\9٪.2Q qnuj0>Qm2Znj\,%`WRn&kQ yTȕ2`KknWc!ʣnTDö3'N{3S=,s3p ?[i"bnVrn#ަ͝ܠO#Z@004lrp惵,:_R ly|D7^$7 lG@e]!p\?/åFsm~񧇚2+]Oy.u48CzwS"/92se&/㛊~Drj0Bryr"uP>p$ip<)@ɸ~s9}ް?\|Kw9Ow}bj*_/K~Y6X[@0E`Ͻ4FMo m4J1\Rb 犤2W3ۖjPLH{[NyQFD"kI]*"Ԇ=N\eI?hyY·#sA1 P ?k!FZF3!zT118˧t(8yKhb1ȏ1 a#)Nzy̦FKZmH >aRt1Yme/YV8LDS͜KCA349TF "mktR=K$l G&8kтq( +S  FQ!dt]I?_>}uȘb 9cX8t`aFN樃X d"JUWJ)Wyd>r%kMzZ`uW7!p>U}vmTȥ3HjIdHT|PO%jY m%4) *zY.cp# K~h']Q<7*sּ̬yY2e5oT=)G$&T6HSd1yOZcuzu1."*K 1 vI伴R#0]5 6uɽK.1@񾎚& L!ň5s)JUͬ ǯ^ڠ@F~9i\\ylwr/G'Z,qrhZqȽLS||T,6y嬠-076\ yYs'fƀ(_3jČBuk/3X1 Ry@*u-EDY@Z;mkaa B yIhRAFХgJ|'w}w19n]ޗ&݌Bsvݚ2?*(, ,OIc>{G;z [@(๽2G G$'b(׵A$KsYR\֪Fĺe~ V׻QYX/M><9y vs](mQq* 4ROXWhWfȀ[KB2QA!'41]fvA{3YL]ƳpwMf>2E {?lCN[WAaQ#G @np=S~.G w J!(lord72zwz0Ҏ T^/p Hm@˽]0zo@) 9&zp+`|7x HGRp~\}ܟ9ro_:?ȕ-aWs57!Mf9>M9ww^/?\tCGNavt 0F4d|}o[lK-:q5uo] 6_?:&`mtM\F >~%4(쯪loT&j1T 
*J˚W;J18ypcp;W}Og샛ҙ7*f VcOQZTX#IYL Jү}/#2j &OJX=]Di%3w6޳z|}(Ts>}>pO nlKmDg(d2_p7!lke&IBHmYݺjvkY-/~Cvߌl8MY.ha@TSN{n'H7ɴ[1=ݜM]՝;w"VrhœHep5|Zu%ޢg#lL uV_;NmGRz=~$=<'w+г/쥶&bܰr]xήNWX=xVu쉔k653C޷wKCz 1S]3v(#oԨeSF6,99?[c.ȈMxz(\BpUkc}4UL<Rһ[0W7o^u 2z3Z8x6O̲>w?\~\2p$#  h ~Y^f x%eS6 ˕? QuЈJ 2U*p *$c!`dPhF v|wZp!OCz; A5ͬ qkD?]o{(Ն=%k4RP&cFEhn{̨UFw@-19NODl01#w#o_p;u}5Lt#.lɿ_`W斱v[M c\_\Ĺ> ѷ)G 9 -:#g/A`#n(ڿ/upߥWlM#.{ŶBzvF*y#XpJHMB OB;15R0i!_Pמ1홬cgiOUBH ݐʘ94JF m,9Cz1=o)`@-ӵO꣤c:!m%"t 6k,gzK_%C6>A81l$Ov}Dzj)Rs!!v1>U_UWWw%%w"vVCtF||s;^ξeͳh|CJ՚p9)M  IkK|!ˣ?ڇf`mJ*+o[KUzSy;E 8,qm݂dä]l:]t .7Iu<83pn>!y8~0yMu7?muҿ@ـ-&X %U6X])A@Ò/y Ebg+rӕeA/ʃ'D_$yD8PvG{c)/swޱĆ,Z|,QSz V]<6NSlt1:hwq(:]% +)^P{u=3F t^1b:* pDeԙ"^Ij9s|5_Y/C;oyYU}bj/G?O>xo1_#=[P@-],>^_SF2zլO- #p^*XA}<1H ˽2xì7[#@< 2\ WWQB3n`z@;nygI gSaW_(3jwAP\504?x crF(TPĈs4OG͓6=ϒyp%.&ٕpkO^ƱY 1Ӽ1TOsYu-9XрAw,6*Iķ_ͶLl겭llmeoʦɓ4m;P {;" 'rC;MAfBބ`-jgVs/hdjTS='#ՠ+݇:Q'r׼mȓ W'|Tjo#m5켉XM,Z+F mʂn-8ZtkMQ寴awQ=VcToLCŏ6Z2L* 4Zhpjj2)lN$T]cVtڻ-i!tDҌ=xqD=X5"\F*NY+0NcoM2^Yw5==y2Y~z:TyaK毼q֟qa%/r{3Y=`Cʆxo#kekEV{w'  L;vanӃq7"Lu5_ԤUSpR?pӵc.nzzYNc6r|Ľ"źl3^yn=6I R3^O:+~ iy>qw>U^,*݊X0$7<{źDF%m{ӛ jlj.3{EΛz'8`;/{דӟ(P: لp-ekI8 qA2/ᵂ(%#," FT WqIpkw.9n@eu)6czmt] DFB3KSj/ Dm\2u iJ\(C;<\?,1h!t8`R[`).mJ"TN,1R'*$pj:% #\0\CW)޺BE 9nV(PX D`_T"y02*o#/2(I /!.ӝm-gJT5'V`7_+,9B.N{3.9x/7Z\yTˢ\nfSPi&-^m>56_≯e!*'{\$ADTĴλj *x災A#u`!=PM❝pZZ}+2>y/iuԚ6IB{FZADNAQq- P!0d1 B \hTviLzⱩk].wʺ4YåN_wdan7E(|So W_kXZ(|7Ԯ&. 
R;ϽZS{.G=mR2St+IZ9^믦 Vs#IQT;!׻mS*{TTy"”= IɠIkc!J33Ee[Pő^Zۤ'?72ZFT i"=4KAGHxr DH UW%mX^вd5-L# 4C0 368!FɘI'IVPJk 6ZE5/N74\hɌ8)O nРg 8B{aID]×:KYR-Zy Wkh)Z;d9.M9՜ R s*HR4&-DR" 38^Ơ_ P1ߞWFI-̩ Q" ,jb#G@Fص$]{Wf_wwl1V6YQFIĬj `WUkkCs%jGO!ɮo x]l_o(zxoC{;RPH\r5\ qq Ӄp7h$Qtt g6+凃k'Fy]Aݢ&0j3_W658~Fok fit3MUQZsׂ9C8}o(2¹Jvrz6rsI?l9590OREUUW]LC{& < ܓ@UWmB n Wmfjrq~ۈ]p{p&^Cj4S64"Kx8mT(Qb8x89 $lHお]4\qi3Dy}"FJ!)DeɆ.qqaҠR@qȫe8 `K2$ky"yF;NOF[bDZ7n9͜9ߗﻋQ<)N7J=A3nU.J8476vjPxrgnXI#RK+< i-ZswwQT RD20ikƚGTSk= @ p-^ $@4S 7  %u wL - Ѫȸ`azE\s9.S 4J3q5_ueU!k*nv]ެCK?{k>}h,@չ2[פ5RX"m(T?=h|B˥md>{*կlC&y3Bx>>]|R/fW{Ե_/gz7_]{N@nIb@6[r$y.Y俟nnKEI69d0@lbWu骮's jƜ'L;Ywz{a&>Ky:fbeXޘJxKVtQ ;%5olOZցխ'`JAb1MKsAGmX \#={S{=Ulz{=OyAQYݜ+}m#%']9h4,kqa-c[zk7ήAĴnvgqNG*.Q.=9?9Mȹuem óA&p+#]2; PDdv~TL}:|)ۺ^2{(jm s /Gsa" ӊlYgwX;?&CmaA ΃< 1ƿgyxx\9N<nGEvm|f8_f0܍m 56Σ>4WZܥ57O7dTTZGTQQ/执Ch 2^by5,VצHٴ#BPU/`Sb].m&(¾ N1 +eE8ǹ j7JO *FqO@gía1L:~I %Tg BgK8DyU3r1bպOp ' IHz (:ӵ.` 9VKE=F*Tc!z^ AQ p|!.0f p1,}*,D#K_ W^Lz=Yý"V28fK3RDt/N?tbYrv7Yϻm2|ĄBb&sP &6!'lBUH,C+M"TJbz}y/^ZT3/y ([ehf(K`n$bS܃F(0Q7f46>6NJ8.~h&UABA"!oX0 9cX<"CAS@sM{ۺ/=~nw$}T+8u/W~@w'ex_p?/_>[xNw/مe:yǥf4֯K+ X;,Ov]c ?8~~NInx'#~c3I8s濍%o\㲺6xUzٺIYun@sM9zJ(".( Ds ^hB cî{i~a,=z#XH]+vvEH{Ry= 6IťRZ!hr!B(3$$R+1n(V{]ҎƹּNwZ//H//X/Y_,_,iˎSͤo~丷~䢷~~q=~_t'_D{?abI WD `2ڐ( cEw;3,+_pHOg}b#b"iϬ<}Vldqg~k}x7vl|qYD˛L72c18鈠%._6v=l֊Z*Wy4w|:+K]  L(TkΈsea.[%϶f|ɚ2V}Vds?wsewmGgv5$_gS!휀W Ah\L@a;IZ~ܡY_yr.c~YfhGمb7kwiStOInG,l1)MeHtna-ݰbUF0 ލ"Tx*?*^UP0 E-G[`p F?OZa:QP1?T+~x[?^t%ޤ,v κ(~! aZ!мP<4:PgS9(x(?F€+JsKVQ\@%*2Ō6_+W}١<ݝotSc? 
gƨA+#e[ ْ-"\%kC*dAм2r h 6C^ȷr|Ѣ^@#?a.PInBKZIQ:wqN+rBT{?)O5!JR]<ՄKWf!K C9SKR]ÃjήXLw8#Lq b ?gIMGK1=V!<^ |VZCŠwց=UmQA3_΁[~|`~v@!P'jFs/t<uubHZR^-k܆Jř s4!~> :#u&W1P^TmzݣsZ6c)fZ;9fnc P)\2\- iXڌwBLnǒXj=iԳڅW.[1b \*gF}ޖ.]ݰ5xNgJ R|ts<ĹS 䃫n?jz$4NnEu&?쥽Rv qA"K^.ANYjdgV7S#=Qb#i 5($$M0˥$\%K9jA–ux$}b~7n5Ͷ"m*7E]>яշ t(Km2WYy94wc]?/Gv|PM.O*O%Q8`mγ,9ڑlD:lI~mW+FBz.>2% mGo"֍!`|1#:cn Hvmݼ'׺U!!=Q/ Y7TWh|1#:cn X9owh^V\DdJW;nR6X7_ E[jinkݪLQ G֍c <;FℊƬDZ*$"%SRuL;Z7_ E[Ϛ{nkݪLQGI Հ<;FJ<)Zuh^V\DdJw8nVn1X${y[s)[ߡȺI[q<;F 2DZ*$"%Se ?n V6X7_ E[0sn=Ѽ֭ 鹈zɔ­pLTb8>kTA?~D5H3*5AS)Zjl˂LqݵNI&4!)]sxjи g3[hF1\|pRMepgA0"H,pJM ,1ՌpnRM`RuP orCRMPw- r 9!\&Pur@5ch19J5Iٵ3P-rCRMPHv/pgQs/r̭ә3c9!\&pcfJ6䘇s^#ŇcRxZ3Nf|19*4昹t19J5Arѽ@ 9!\&Agr̂)=䘇s;ПjȐcrUhb{9frCJM%W=!\& 9!\&s 9!\&(;xY+y9^䘕[/39f9rCRMв9fMrCJMФl9f%|19J5Acu<䘫c]Vu{ "|fI</2yh6톎W톯뙎~gv=z1-oj20dĀ56&X]{D iTξrM0)3|~?ꗫh9X{1kB?i%8b]ӿҪN̈́'˞֯~껯v J ؼl$qCDF&bX'@ptPm]Nm_F>*"0"ڔ$~*,8*ː1qbaPЅω:-75kբ nGi'O(gfes< \Ή̘Rul)JH 432D1VcȲsiD䄐 ~0}˵"2$([G\o%_ $@A'DLTdX"1bCx};_o~yz;Ot۷vV~T" A_Ú 9MFQa(@ BŵdN'B@a#"fqXIҬ%+AX8A`ڦoJ(=!RdAcJZiM-Qlg$Q $b ̌X S -`ACL$@O&2Rk#xB)27d sEn(R)#+EX2}&XG8-Ȑ 9!4FKpG@}(2?;uZNf%LIˋ Mb@{l=焐Ҍk@Hq\6;dRqb4!" NRSSV(;S8BAsB_e݇! DkRy5M˜8e`#XP`,)xQ"D0>28=zA(%=pF@xk-xaf\Ocb+QŸ h7LqEg}nz xz.dp6Zٙ~vQS}݆ۧG"ZM2V{ņ4Wrr7^_1ADL<ѯbښno1OF}~6Zn| ؿf3x>#m0#SÏ_]Mh V^hen~=!(/>"; %Z%<ǯ`9FEf?./8jfygW7aD(.L'íb+ bM[g1!.¦lsMg&27v $?d)Y>WoYզ/< 595sH}3k8A>q/.F7*Zw$7)V?eJ9VCz )ʚ1_L1~@4)/Sᷰ >Ѕ@"ONI߹OfRPⱖx 09!އ<kC6rLᨮ32շot [؀ TsU lO%|NWSus"G>$!CQ Y] oyHVfsUm$_{GAJgst @1YZxbs$;u+*3&$Խi Yd,js[X\6ZvRl)tZvk鋱@RXJ1O347GqI-zZc8p*1"C 2ϮZ[aޔ5Q4ZS\z>ۮA´lH}-OQwWQ2w0Z1ǯ酸m!ZQD[uqʺ3jJTviBce!|@afH J ēyd2dGfy|wGdE$]<-LG6;|>k@!l4e)\Wx _u1{Cij $K%-$Fex ;mp^j׃}[J/LTN't+ɢarzLFx4Ow5r~sy"i5(>^>E予%^ې>r|Xi{[[Ə=erV/X9;)h\}/Ӌ h·G(#gML |fqR'3x֨>|?,o+fE. .ĵeӋT!X\ӘcnqM36og(q羋s*7qX1ڧ:q;'c#Yi CDRJd2!\.Ee[+"(hXlXAB}~/]x(+%=~^)H^X$^Ձ.Ê3BX•qaLÊ:\jۊ²e@?X(*|ĎC|W<0τfQN$VNfm@le~ ]? 
,%djSBnXt0i{gsr-q EkP;Zx>4\J* 4F!ܶKxhψosukֱ`ٸh_횶785o0|ukY5yYE5=>1 p5#Qbgfp|w#{lK/eR F^Y/KBd4o1~*'}äa)c'&5P^btb7Vsv6문sfXBj}&ph9xu Rމ]幕v = ˰?붒tnQHmT.I9;rDc$\- KrDtl ,*Id\*bg)ξ8hI4d |ۨO Xmmtɻ(T\l4h42)`Rz+3 7`gKX 2iIܭrjhqzl| Xo7wK%@s{RiUi|hF&_TڤI<*6L^ Hy!1"nsʐ$b~N1#Z1n*".wEziu뛫vWs9erx󀳫eBz;=T70hBE%fo;'"BZp dܗGx7_㡫FsS1E4 0cRkmDB\Y"$Cv&`%kdQnqZq-ݥI,pxRzS$29KLIV @Hm%oSɐr1I-,8UH"AvZ L8.0XYwm+.kq ɀf]hE,lBYEp)g ї6MZ2HnKJYaw 6 6!3 $5F8ZZ1䐒JYEJtR"\nM雋Ne|}[!1Rr( fӭtʤwoFJ}~ Gr!۫Cُl.;HFqpy ="O<[psvFn][~=ODq48sWw\_y%oS6ㄤEhp~L&BYe g'HpM3MÙk?˞H%삺i!k5~@be@ T$ Jl@rv&D˅PϪPh "NFVJBC`# 2gJ`Eh@ɱKXcFZ#7f<M;%1ܰm,]jMa 4|@d#ۨ@p6e)c@Z^;l-Cw]L)4'\f<W UhJ CS?{OƑ_!ewT.`d&HYNE2$&%5/[ͦ$ahVzGpM8;$S#qoaNDp#NMT`ݜb9R>C2)uA)*趬0ʕh4M1- OF')u7 FM0iJoYL %6R1OdJ-K&eˠkWi8٢JuqY6LT*ş/Q0IFdӧ i=hu8B=EZɻ*d쇣8d!K˅^?Ɍ5ΆsސEfIu2ɽNdGXdz~e]k89Gù./ %cRLipʹ5ŞwmY3~V*@}[/b QZ#- |>=l;T_m@!xG}m2[Ł#K^<Ӧ2@CIAbnXKfz!p4u;Pop b;)([,PpTt?j.N^m.QI?zO>dD*7`*loiIq\c'muYu]G#J+s%P5OJN `Ix_ē ֓b;vro?TV]TH؀[\igS/OO2$Ͻh*ô7Azp2]DJHA Y+ ݲڸ&J»1v'e \0*"@ N_R}44#tO{`0FY%{ ?r&U3fPJ&wՌq 5d7 d+d#IJDΥ&Dd ,Z 1YAt\PF|t[mRHI^KN:!hJ2#!& Ш>s惢[$qrHM.V'itq!{Ůݩ pЂ f}8pBDi4h3S$qJ8LR+a[W˷oT"FIdLF:E%4*[HC)H$(M_N%& }NSh:*-p0`2m*Emh@cv̹zv.SR !nry8u9$̪) WU'9\mHjmP\ 8F+(ikM%4bQo:F(r{~dR?r[@ΊRG=0"K0F)MNF5BwGkETB # 11[I_2 )$:y:Ϟ!mC@qA12w>V?yWqa1Z>+#Ut3bzxw}\>].ӋsTzG_F;Z\RahG֟c/YL7<4$n>kCd]@^('? 
+#wgw*}CX0Ia)n S"2ʵC(7;W^sߌihșcf aZ%W0( 4j %cM2hADq3;IH"HA[?6&94D p31EO3ZjؚrC0f`[ v@P L @>8҆2/V%+u2=ԹCݧxEns ].yb:c PITz/gvUr2$]iTzr0;,Ow,)M8&YP|_1qkbs^9-rK|gI iuQU϶rl;D k;"KA$s"$yd'Ribw>^sv;҆@O]OETõ.mim(-)dI&%6X+eO-s"$!}4'A ZZvsF!ޞ_033&-`Fa D`j\ڝUaݾ*U P)*z* Kg1\JbT^Thy[=<!bw5 5>j^k-%y@PpQ[nTyFS;0?_N择]\>ܓ7bUzOB`pWhuB2 bN܃ʆ ь--CSݨS Bh_0!1Q6X8%\Sl~G\SF1LbmRY33afT9{_Q7h Q{< HB=J} `te2G b豸5^ΧEm讒INۇf \c8ЌN1cTWT2_jޖI?Sr}'ɽ_ >cÏƓEo/W4W[!aS#2.kS~},,ui[;; ]7pLbYtD֣h`K)шx/X4*h𗊲D̲,k# ۄRdͷyaSD[-[!9LC4UUE8.'ȭ}t9G-J4Q٬ujMM5 \ܞ+z/;bOS3T7ׯ^a`:6DTwFkr_%A쌠L\T9v~x!4ϛE-_4%VfEӚ㒢[x~*o?܃~~|PO_e!$}/v?}t1S'ӧr)MV-eQnnj ˻[/ڣ;⠶+Lr*wˣrcPe计dMw+:!ݲ7w7(tt(55h-Z~9T>C(έ .~!|qYJ#ڃ8)94KQImB54H)0eDPru3+YԜ&%xiҼXdžY{;64#wl8p٦G>\(,Sc!h$QɜM"5$8~[i2`ͨt@b++D(Ѡr?r@a*0^Tp51rc9FrYKn$EҁK)6+K ̀QJ0JZӠjY%j{ WRJ}f_a8^1g"oϯDY ɸpWW4K`U.מɋC܆̮Kv8f1YCHr^S >vxv C;-DKR/Soٱ:ULR&DF^'2!Tv<[6[wgwء;IkҒo )RQXAA ErҒw8!'V9;k*`qs}4{Ke2:`g (&ypA*BTn>}(U4jB02&sftɿ"2U|pYp]Η;d7;c'~$+Ւ"ݲ xldX/A|lVwz'fva} ŕ]a1rRBL/iBa60r6ɋiW&Q-1I_Z::Kq2[Ffg1gMͿ'J". 9h|ֲ6ZOGHS5 1]F@+\DH:Pt%Q/ z9zҎ*2Whڲyr' *wn)*f"&벹vfn\6Ɇ5"-KrCfSOsעrr=ekTk<QY8 ?+f_L0@)q;N4 Vޏޕښpe45O'\l=DžRtc&ccl?j`𖩓j){#'}<$B?1 1v&]٨Ⱝl&5Jʢ8m(ܚ6\u 2+]{eUPF`BUU^Tcuk.!7ޒM񿀻<ְbjпOQ.sh N}"Rbk>o6s*y_, vh:特'c:j; dɽ*PF0QfyMtªd v2YUqM]$w'?JE2"U B k,%p ^Z5Ty)V2psb$BKnhbiT(P,tL`uV'~TlPo7ȯƑt|A#f"Y2M~MXG'K'(-iB~-$dn*1$(G'nnZJnyZ;;?B[F>O,0_Bz"3YA(j,@a\ AX d]\eUֆOqQh LŒ'nֲʹ-tFIޱlj뤏\zi: <Aft촔\O4$wIgzn'y; zINHAZ1qi+g,ZIVXkFVnqnPSI腑L׵P9!E]i晬j,.XWR aќULú|Kv5x'A!"DPY6*PRKW15 dDd0 ޝe;J@sm=du`ncy-ټ>`݁#VfK'$J{QyѮ^XN$aW ,{mٲqeskI6jNH^ VNsPE ,ӒWA N Bv$[.λ̖Xz.kO|d`3?"'kz5\NyH%e`kEk;;-l<>r߁#ަrGS?Eoodѯ]_˵[=\%]|zrz(ZßB힧WX3|KIѤ dR x$-U`Wy{aZ >wy[a=8 kwZyl0cT.Ā59 V}S)p8KY(-~|5tVgDn8#jAu%I.I YoM=ނV=ұ*(̟ D=\%Dݢ I(!Q?4rvH->;_8 77Emő}ٲ[>ZW1|#,X?-KF1p0[4%+~qe,$b%e\0*{{ j?Jq?`:8r>[A'fs97K| 'qD89.͎!)J;ԥ J@N+Dv*UjW*,@& ΙLJCxшx1(&,^c)-BZ#e`}*Id66 H9@ q>BfO;DSm$A׮c*JGCΔN7lnXa<>R&-j|Y%/٣A5A$GqMW,I͹beyW) J55ZRb;J1y!PS8\hZ}|:)T#q\|AUeO&c`Z9f92a5pWd|g+gH0kx\ɏdT_u]2b]s.oywU9WrtPp %/ҡ@u06]zg]#Ѩ |ТOw!VPKO~%_pj%ws{t 
Kt]7D7iD½{IJ.ڎ=^֍iM̿tWE+vw7wEqnڀ(m7H% o+A⡤o wIA 'e 1e  l#0.;x$RLrjT^IH ,֗~F> 8a?V~êIқ.q,isL a~7#mB3j7_ =lſwDcڰ~!{2&e~_0U2HflJ3Ol9j>kR1n_ G7]:O1 Hc[Wٿ#б_i_g'~>{鸙wRծjsRLbUS-:;vQV3]N`XrdꜼB4 P|h' ,K%Z^yDwOY>U@I->H9O>>ܓh 1oDQ8sMzfdwiϝ~Y 0܋s1tI~]YDFr&0bb{!v_:Ą[܄$m]Ed.&+?.HRhQXM _Yi%wWemn(*EA߮TETQ%^DxV˛"}jnk/y3)J CYoҕ4+zЂp9ac<C+d^u͙nj;<̀A|W} fٍw͞ww*{oN?A$2 ?|nI9L, U|7 x[O3i[-P=jT&|STNi`"aXc3:e9 Oc9K:Bkg-E!EY*ub7-XlZ,VMSa&$̍{KZ$ζ ;nigP{]9pA>BtEڤ7hϝ<بNuHcN& ˹^rEi"tzl:=k/ Rɡ2O֏]j}8Cת:"V']ꤋXtѮNj?tNSjVhd*^9 Ĝ"V)dLcαD^*䇮܍r7?BؓRMV}ilgyw9`,quouTANZbFk u <'7789}i0' N^ ``r2h S`RI Ab#H'VB'?!4>vo:*i!̠oė5锐RA/b*ELTrUkBEBU6kUX*k4LykhMa*^BQZ5gJJ;0ĝOSAB}\Z"}b))e,#\om)- B ʎz@LEiK?V~t3@]ױ̈́x-h^D mpB0C 7ж$M`Е *Zd3!w,3!wVF 34C[dzg^D?"ϼhk -exEb}A+0`!bU m$&U]sЙ1t -F 쌤 vF.7>\⑍}*n)ja=ʒqM]-Rp mAV擖r d:T%+hZ )sUSfF9~/J͞ ` aRg7}鷼h{k{&ehtG;m4\3E yXܲV*+:&29"3{ KʣF* 'b$ܵFzK_U\ ߌzBуH>2|l|}}FN]g,a@rpeTW]SjTځ#+0iP;ʪ52'd',kiTO{dń?k il&UCioF] 5 -HP3bAeUZr8 ouA ߲ژRۈf6IjͿ!-8.m;LPԛR[,[J)ޓux$%Hdm,hQmk"bq-#/dJm )nElF8>W1TuLju,)Fq؇HY$Bu7ǭpeM)dt9$r!~V/GqX*+g 5ΫYPZdei?{6/[{P*?%٭9I%3gT ڪؖVl? R)J@dG3UD@/V䰅Z3M{3 q6MC*LET8g&ՙpP @a+ޫ2.G!8m֋ ";WT5b 叝E9Yj]-ҡT p98"|&;-EW/R*z;Wf  &@ : !(03ʔ&ȵJAq&9M$b\*7@ v+:Z@3a#%{KR(bd1[;f\4O󾻽rzq=ᴕЗX͝`{NE["xǹ{"rm%Sܦt-.ΔH&U`6ADPbNjg}qܹ)uW6S)Gu.4fp9208 b8> 4Ʉj̛`A]|=Moܯ289?ϥRq 焤، +Jk4jr2*Vg%i .0hd^YV[ctµ1jpP6;k2N|:yJbn" jZT̜ ܷĺ -fRVo^y7DcGbR1G"XJ䲼b%S3{ _zjjWgfv{p2G[T(DS>s4bz)S/.;[iP>w s@MC8\͐߯ԙ yv1P3E;:]׳;F.d h7杁S(3 e#;KRGO% 7U=Jpyb*4e?;קp.J}l\|[WywU>!Ǧ?W [V{w? k #fLn[l"g E,@;[@Na8S@ ¬k}کV>[sģ]GοkFqzOxQ  :m|`K34D*-93h[å'}p} ;Eh\-lN`D%s\760$>z1lD@@] GUG @mW(P4DًF9FIVJ'IFR3Br$SuN rRNE{Qn˟=κ3kB2b'(U`T[fR?vș*׋H-¼kUkg|`2^EdgMvw/.-o=]@c]볦 ?AQ_a=Xy^7B8M#'nb?_=[?!/ޅ\Y{Ʒ0 U\zȊwJF(Dc^;04CC.r߳)˪qbTaYKA1f2 r8řZiЦ%Ƕ [30'?Ec^R %z9lvڄ^EwVr켶@Xq@!EBs8w}_n^M 3k頇]K>. 4A!a"ӽp4Z iQfLlyŗW$>ȡc+~!~y߫D48nXϛ!X|vyHkϧM% sB;'wz4khnֈ  ! 
>k&b2'S}}ɾ_,wJw煹YOV,=ݵ6ƕwQJb`tn9~1/oo@; Ã_캢7_)QW,Z\7ikłj1@+S!+o\m b` H,ڣPiTnPh`4!lk(glO|1ؤ$-` SSc5MQ2YeZ O-?8BW>k"Ag`ob-3–JXB2ܽĢ-cv;3mpjWborRa!<L*夯+FZ.S{9I?^V9vӘ⤗2"|هt%P5 @t8_>X"ZMa(sT(UL%((Giέ) )- 7]v޲a3Q6?b0dH z KfZX*wX9*͵5$՚\k$}[#ܠ501侈+ a)J)3jP))1ܤPLLlDN4,uZl L9e C)h·+fh ZbԐ3zPGS-Ns7A`Rܘ9&$5TL ~JD )&W"RMtfGvKy_R;_>=w}_`W.Ut"PIO޿{X>/^v抯~81z3 ~(|ǻ֦x+A(O\ES JB`t66Cqg^' D/V{h9LPWy6kE$p~u11 Ev!Kl7SJ6<֏!#Q.t:M ws*=Yމ&B:{</F*p''6K1\p.T1魿Lw^G>/9_leII(>LIf,r+(+c!?CuX YNd$vUz_&۸y.]gW^M&_ ԁX@'3T37R^o;09X<*P^],\1#ƬI|.f޷h* flfeӧŕ7jp F7{9#+y" ٠WeVVM~[[_e/>U0/+{qXWrfp9 `)JfJgr<_̾/J-;<:!$cA}7ZM+׾ D-W!Ui5ɛAA sCLj6QBRw:@uj>5/su:>}>ueOZB g[O,e?GVJSR $ `t\oqQ s/ G{3;ˡb r/ǮC8ɠ-qg^}m]e˹ $Z?W,Ʋql*$}*D;Ri ph<8cCL:K0AgI=j)%D%m%nm걭Vs7 NKuU W.GDnve"LY{6 iuJM37E^lCm:!ڑfk׋f#\|_ i/Y{Xp@wVks`D{sT8CtY5#LM }ˁ4NhuuUyJB鮆bHʝ$S씴T*l3buedw5{4›mSA\\P {Dg+m8oױ&e90Y=$RmOC;KyywVj $JιJhh>JuWa/A0{eQ 84 ݱ@x/ehl~PZ&q{ع)6l5Fg2 (EKk#[[&dQ9˹=ըj raZ S`rUd)ƕ]{ߤS )QO# ewVh o.,g[۳ٍچ$_}L5s?K.ռȣW}g ?;ofzWy)AP~iBipn˯;R ~.9F9~e^i8 ć>]U' y"IjjW[x[W rD;h=^ Lu?JW.d8W$[W rD;h}FYqH~vCB^n-Se3s]*匰"r g)qq5V/=YkO;nְD%OkɒZd)ԒfɦzB('SD*#[}.Mk[ߨlOlcuώW{1 z2V<\O2!@Ycnsll׆nayϳ5/1j].Ek;[*]s6?fq*5v:iּQpнFx`̡{JI֫C=sR' FV`8 ])V? ]eҫ&Rۼ^NU33,xm  JZ!Z6ze鍷!('N@Jvmfq,=?`h8J W긂:Zpx2K\kmeThRHc(Ƽ+0L -g":&'3@b!Aܚ LDZ_1)5Bj_J7+VH^R[XfXAI͙Ⱎ 8 D#l2}"(4m \+ `w*U0lX//X틅VT+nO.(LvRcpbSa$߹J>Rõ+a4eV xfs(]3S*?mH^=U C21>H L8oǿPf0ъ2I 5A*]p`oW?u A(w!0KH@PO[/h~}UH}2v_> 87{S.ʸ17gi s]BnCw] ԘAN҅Xjpb2hDT|p :.~:tٹKcciM1vk]Lq _J+ *%+meѻST1v Cچwy%FJw!XPZo@a4t)'?nhI,b=[$gux&j.)U &21*6Db=$!`aٻ&@ԅ0;ǟfE65q,Be,<۞G6;ƾcޕ'{ǂ}08Яp􍁏JE~ aԣ\*EհG>77sбq5VkQƨ h㙽r<`V)C&Mi!Ub٣j'wu h&@|#4C*瓋 z.ǻs.&Ba \t XjDHnxǂK?ȑyU!x!X@GΨ06Q(iqBb7Ί7lf d<)ųb(e&g)eJ$N9 'j>tI ؑ>W||v*'w4?E`#nn x 58ٯxH!CT^8mUj3'r-OZX.5i9A{U3IƠr}sNx zY2N)QTZÿEn:S*2"&@*O00B~Eogc79 DOZ bßݻ0bUoۓZޟL4jgdrjaZ1* LA&zJ(8тrrT1 ( J q{>fy@}5FNZWcBaeAK*k>g[aT*7i2˩ EQ(!wEN"/RN"/9n6c؎=Gd5>SB*uQтQC(o%-W*:m7|W8/5u}軲~(]FmtWst?O>˷kƁZxd|?CR\?$A^Xϋd=/Hn≠QK0^dF|D*RVpi aêGn? 
lAN[ 2GA%ƒnE]}]޲> k{ ~)%g79mz/PArB4WW?eLx.g8^zHoƟhP8(8 cPY (G7!_H GEpe]I "b[tpAA`CDǎY7)eG`W?#\KC%luY rgaQ)HK˭Zƍ&$*Zַq=ƍg۸d\38/^mMfC L*D.M{k4r̀5^t8IkO&l 7(jA0oq5f_} 㯽To9#@֝rCKs0wA[rnLOԒ#RtT^Ra8e@F>.MB&JaA3VnOfFڑBh0M7u~x< ~$d“V!fL#&M`2L(᰼figSx%Qѐž5SHv*Й~S'ٯiH7߼f7<. ݭzCϧ'zs8]ʴNE7!o&&DYs}Bfz7 md6" 14-69m=^asxOErKlY .!<`gZcՓJ@$ 7O>>1yb6?OF_]]5]dUSO{yOvoV"s9דGP镣hupa16ι$`+`HK+nR0?L>TJ4]_z#2K0ޅS,>ߟ}gy<N? f^~_2ׄF:Sm'>\KtYYzW.dJ҃7.YQu Gtv۟k$qZ页[j>$䕋h'?Tw{Eq*!hNwn Okn'W. ʬ@ pw˃<>Qe]t`(8 aQ I0&PHC}bc./wi4 S0衼IjT1(B8?J;!r.5Υ4ɣV*NV#r/jl qf<[;MN&OZ΃| SSd%)L-Q+á 04Nmy8C'QnB/)A=VF3;>ucɌ282SgyEV;Vi5I f!4DaKcfDžm1 jJc7/Kmݟ'sws zrKܧu kcc_k:G.gcA*b_4s[1`à,%GLΤ2VFMh8e|bJ"1 f뉀B"N!{~[f>22B8K4&52S(Ze7{Z8j ֵp>>Sw:AJwP {?J!^{q`j8Mvv' 37mC(!)h`sA5l/0:E[)Ywܵ1ġ=)4^;wT2sMd5/>Ԝ#Er^>(5v&s,Cpmh'nc~|4SPH1 wB9J؞R~sm4ʠH6&7o 3P)6odK4塪@'iymp]$Z9J236Jᘉ"s0H@xY%[RUA'վ|nEc׾KgɵoqLHOFCl11Zb@e+dA ׮}eVjT Ś"n<]c[Տ6}WB>L8I^s( 9ǖYhc@t! BF0=RȥCz@HITVxф,:δܢ""EHuJ;  bcLc]rUyT몆| :Et9+ MF(1z пQPa@}IBU>r3*#FMjƜOۇxha]hGK{oܭ:"̛NEaZ-8=\mίRlꏯ}.չ'iѭ .h"Ƙ<n>]^V?Auo`8/*K8EK;9E06ŕϔ^'豭b=(R<;nRcP./Oz-%85k֠#&4[ޱcK [KVIߍW.`Ψ{V8rQ#KScO:Jql)*Z%G_N")}}/(Y`10@:rEGdkx+Hd f Hϕ+A]@ouȏff=G|}fB Ƕ'BЦ[Dں~[)9.j"pYrmJ) X{5Apz>D;A_֢Mb&xr lF6 JB /Rhİ+<ߌ 8qeKB|1@t2%5\Ɓ526uYnɠ[dOzK=+9f^^ӓk_,0Tm=IЊ8ezTQѽ+v9,jx S6m5I蔔5AQTw'fjy#dlਣEBTJғ j,VxXt=tJEBad,m\\ܾݘ$ܻFyd7]\CID6rӖLxRz2ζ< v'&c(HF3z:Fp$CU".R59䚧dP2N  \ɝt76*C1n} }.+bbW+T8D[FCNHE _tkR.#T Gz zіIo5{Ch续E*_AV?ij\e{GUs7dw .FAXKi%mg-kcuYm@ҹV.Mf9:'~/p  8V #)򎕀EO<}c8)ӆ"h}z,YcHyTQ Dp⒬OtFvtI D`ɌP6ocބ&y##UL$Hly:ؠx#E KI̺SIQ=i(UbszHCR{iu+1Gtwq_A(S5KsB!H@IP(TLHSh2r1 .Ƒ19͒(?IyDD)19H\N$c*1I*eͣ= IE1ϥ>zW.(B (6!h+Lcčg.)w<w@ܑo^uI\!Q<zxI+A5益 چ&z2z=_p}},|yv^~rשׁܥ 0JUe $h!R"BѢcN+In0uY*e KY2˟xկgnc SvvT=6i3a4lS%[g̓.q[=; 9nyWջYV/hbat3# lw>i]gIց-oamXV"Iiė~J`UcKpv: vFε~5;1~|vZB߲DX(ns_lb,Uo0\>ta9`s.&vq&W-"͵SjrVi.&C =9U`pt{?-ZēptT,)?,",z#}F%q-'zr ^4vzY5=@@ 7ƀK^~ءzsƝax6%z4CtL"vgg\k[_{E,Uc< ֆ׀y-zo1-洶8@@%T,iqcc΅V/ B; / l r7g|f {<8M<(jԐ_5ɯ N(|]>v4PU[DE脠.RP>d؛3nQ E{[a \yؕ}y7;/ɻ9M6."= ?aъ?37t:Y2~Dg%4`9 r22fFJK;&q& ZC 
RF3VD#A"{/&<ӳNfu\5#^v?ՈnD{Bʇ.#YRBي[2oqinYS̀Kl TF1$7 <՞uoqw݅;Y&@%0 1(A q-GzrT/bP.j& -#(K1*1Ja_z\h雸Yb"w5` pzS'X,ki)h25wxq(}^OG2KէUyCUa`{Ho;԰I]MKc>Ո $rZS#o8b؀ɔ2б}5w^;r~={~ۮDlD/pJ!V9 ~ZĹ0qkP}AJ?JCo Z]\IH܂ 鱠M(fzs`oylyd ^1V#Eb,AڥCKJewXֿ>17ClZ'JJVbPuޜqM_8#sƎ[iR7/ał2$OVgh5S^t98]1?l'- Ӥva1[K}[l0(lh2&?͘ߪ E.rIEĈA(V®v%7θEw.g0\Nj̳ihvґiK҆)t?ǁ$Nyt&DlA-eUxc~LpDn[߽>E=56FRyLhRHY2X_~{uP.V=B_4o$/g]'/?yҩV(0Ք)e䋟^o tR# ; ߩh cl1GKFϪd*K2lZ*/{>b7Q_c_t}bt5lZVz]<ɘ06c #%01Y ˼9kq,jF\p`$))HqB%YIMvWAX!=]3Xeb.f_2Zqzpz'yUGA7 'A< za냶&y! =A odwA?&`=h&vS2oBE;sVSAhKuzMI\HFUKB[J @өՃãs{t As({m):LL7*K}Ȥ c͕ ͔I֖P:c Nٛ\ ͢ϠvNjŧd{KP:xJռN:[ u@6:` VjCQRMPJMZJ-#cA3cUzs*q8MU=!8 >Gm'UK%t5)aiwd=%9ρB$tGB';ZρaLA! 2 \%ܫ`::gFޢBПH6 Z;4y%hн?ycEp^cEKhȿ"kz=ȔnV4nAX64䬵HP#'cOMohfO}lǗb. eIJvnv%ٍ=JSM@榮oNV)w,W"ZAN>15@T\\% wVZ9-{VbeG&0$fؿ9)J$>Q$.8>:>H=dlG!=P"Oo^>}!鐽jY&7W@a9MXvֺydDTJGrj{Ȯ? K)(U?2z Z{*K],Z*9Sam&֫q =hV.hɜw[SYf2S2Jv} /¢7GYmTCƩ%J?MXbe/e&G .Bm)\u~0NmկLNI?mVVtYYeeE睛ǜ^8Ol(86}{D/FI DdD qFh9ϫA&'=Ioӓx8p.Mfdbvm%tu8mxK'8W٨tFF!('ɒ:T&Ǟo&+* Ul]$ YSjdQQ`c#旹Jx=|$b6_n:?)}JM8\??_X_'O᷎k?7A[*X4E_@i*?ar$,`~9Q [Y5 {TOɦtsg_Mq[%p ;ǫ7Y]SDzW(:7*~nNI 6\$u8'ֱ,a 1I6zƩ>43mCax߃QsS!Iw5Uн^zn ^p҉ۻ 8 izWJ}Ý4Фh^Cjop_o_{zwx==|bx w~}/q4'v?|WϏ>=<]_o^: [gZ0G~]U9YЯ[W 2֋ZLCwY^IHcn/1te0"|=[!B1S|i}H}5剫`v,'t#b4=Z)N wڗͳy./?0g?]O0ȇgIw>:]検KĆ~>~fyߺ7ouN?HêCUB5G_oÎKo4c0Tyun}Aiʧ^$ۣRR<ꗝY˴_0McrG% @3=\^=E)%[ ~BckrS/TlJ /Rj{ܥkxW|P)pwE{2. ͗u|4Rng؉Vѣs/`4/zk]u`,:OKP:<#us~=҉_IK!Unw?ݼ`L{0j(YЍ\9ˣ`j~h,S.9=遮=u P;Z{ fR_40РkMkFjP+ICݪUz靝@B֗'LI'9l0hKZ9̶v ǵ\$;8i"Ũ?z+[mσ*϶;!'j$Z3JCTn`C/;.pV}Faao܈ͳy6Vݖy67Ͼq+= rW{V]foE"Ɋ֣ǩh,] &pJ ڝY=BhmhDT!k<1 S v0c,:rc01 պ{4`WZ=!\R4+u]kAgIM}j`5a:? 
nxBa,LWAeMD|I#oLoퟂN7L4+N^?A9_[f31P2@ɷh5*yPUlo#{hZ\[]zm>ZuW9 wď%&C<0o%LU.?ԱK)ҺDp-P`$|_ @r>5pw@Ď ͍W%v+XZ*B.hB :Ojrf<)Q<}h)$$]Ojb26wݶ1RpSE"E$b:-mi3ap0g!ÐP,bZjÃL1BHfQerWp -0K|X\_YlR+V mgDX+m&1gZTt+G@+E bqs+,sWhƗ!YxX@*HŃ%GHPt#@Nh bDs}OŘRTM]!1[9 APُ1_Jb,]t֛7o=Ck6 k%g m!lf=))Ќ;>^)D #o@,-#i9Eo4b&iD} ()8"f0CNVK,"U)4R&ƒT ALjys= ?aCd&襃D& <|lu*[Lsg"#lXV+ݪ\ +x$%coyrNZ8U[՚?J]/כ[ l0`C{4vk^H^O͏=v'`#P]rnj..5O*Sv`Ej;>Ԛ#!jj6( 6k3|Vtp.$sҢAi*L|h- TT!g7jjȩqkӕSmnMhJ4rr“ ٢&쨜 $8*kr6QY-`l4ڂ?uGݓ/03rVu?  si ?;u-fMLJc=&ȏf>|>K [*lP*KiꢎE$<PD qh12PQ0ljW"d|SG|9}wsU..W}25 B .Dd9/ $S0Z!HVRvZ#Ϗ~|ww n﭂v ^ޅ<*x}5`bi L7X=˨|JP @M ivL9[3<ˁ#Sd/} " Ftz3/pc<~a)iySydR\<) 6Gs0#)Z%zV~O:pˆ#: t`4[i DjM@)U,]}udxfxpM88%Ⱥј_xO8nlp Ծ\8f52N+!]wf.ZS d3Og`A+-T DЈFp4%ANJ,,K8y-$lZHgZH^ k!y-$,n-d@7*#,/SC8/p$օãy!-ԲM2  ܳxIVmz-ɝ@Hbܻy=pZ/<\]x8<ͫJqs9Z~m=~x{U5# Y)u`DXFGʂ`h9SF3Y*B-HdbQNJ8"MVWtzkhꪮR]uF&GܤOU=Ceۤ V#v)Vaa ̘`jּU$U%k.zII=Bj"dRKJM3~39tEї#Lj3$R+>2ͤ6Lj3]d8zSymkXB @U"cW1Z]p%x$(3 gaȇ@*!Bs ^1ɩ\r-Q\c-h 6M"?~<6Ll3vE#Ԭ(ŇRة72Z(I|ImBnQV5TAz\E S~ǫ)&F<jeva!"-'QFሒcαQ\8 aZ撟r%xrDlfd_'f& 2ٝ 6 Xb X䞈X5ZYD [ 0P#\SKZBޒdPf@Ā.&3 (3Ϙ}25aLx/pih] XF]pydZ$F鳟:SqdEd`4Zemd߻Цv9Z".RM$ZDQp\rL(M0A*AaoRze*qRE璠^NvZd&9->|gEvZdEvZdTCyo#Յpי괠).$5`K Pp)EnA#şboP!P!P $%k[]2ב kƌe:`$H^L/[Tj %}g V|vTTQftU&:G <"IqI`Lz͑! Cz@BX\t2LhP|z9pfB e&Є2kǎ`m(THXVJ*R2{))ya{/bC?Wo/'7Z?yIU?U=.Rt㐔vc0:?0r8CL3&%$0% 09@V"̓A}qdTfN9ۓ?wi>8'O>'M%^ɉ.4xv5OꍋyY}[ ǝmbag]d{ӳDczcl-ƆgZwL g;+l HUMo^? 
pM],I$j mN[ '#~!~#oEw?Xtp!jh3M=Y΂o}ȭ` $l9@ZTZ( c[s;h܊^ ܠ!6C8k/ k/0tr7}jjl>@^]h3= }_>9g5vH*D ,5U &S#>0c ~{UG}EHMFQ"(}@gjY)5s?ƻt Y!B!Ԑd?kؚƝ'!|C*NOߟ)W{\ n (HdCy 8QB̦ уjpETLiSSuPkw'}5RUZJ &qT_ oz62u/ 97^"#e4ND!+^C]3.": Q[v0 8YX-) *ZNVIxѱ9xW3+3b C1o^.޳ V"YPԾ?ۿֻbæ̿Mĭ琯y}rovt ͷWyv˄||kOn;Z/=4P 5bo~ߝ*UR~>dC;AOO?؁TjEgnA(fOdz { ?yBꂍׁEFK釐C^+wlP+5PTM=GpDtےUu?iJiЩR\#lj8OdL;ԑ^y$шyE&1i&Jk,Xw#%[B1935S$_<j71cSivJRq S)Ǭ)jK1{d.CQBlQqRNTwY: 33KX(`yS:?ې/\os+d|x\ނLApͧ8}P>,Cj]ʃ\.ű>I YAFkjvR?08K`VsYC9Xٌ Rh8{EwAمFV` uB8,mX {$ QI&1&F9G0^\kwlLeȊU.gU8;x .}/o/{ɹxXt5bsv򡊮M$Og6E*!Ы*кԽHAmk=(};BD˅rdd,bJ]^ 2Rp< 2+k CY I(UEfWdYumY5UM}+k,LqW H _ |q7j F|RyP,_o5:Ujꛊ&bD'g;7f/Vؠ\` 껡v/\ `Co?nL+ klF X/ ?b y`#_aلq.9\}o40T%ޱA{H(EqUneęڶ?qNt?%`̓F &P*chq.,?wtP]}.%EQ,a'^Xv\m{kDCA7f-8(0,b/H0ܓVxͅG Vw/*[K#ԓN h{XQC5%ը ƝQ;Z)W9fؤeg ̆zGaAt5 zǁx BA'[A9E9%.@%G?1N22V:Zc+T'>@׺1opؠչ;%M%O*y"*)u\Bct40hɁ:]'BI)ܭ ѰE:BB}K&<>⇋oגSed}ߖD.V צ&7-X-+pLdKN5l6F9J\% [ |Dҵ:2[|\g?>OyrQ{&QOP;v:+g2ȑAI#]p<-1lPƉ7ݭy\ޜ){O2):uK*3yN#^C1e3ۗ!:6<!(4ěkvƲo3e2pIÞo¼f+{gа3 '/JӏTr'|@h8 @-OAQxcYt 9ܫ35"Oߕ4US-k+jYc(}OQة>2ڢ1XM:z, fIoRiO3`u(oz4!ujD|X^wt@ \"<\(c*2I~LbR7vsmA0 I!ޞ#΂`q$>*8f#̆vG 4\c)±κ0-s7[@Jٟ BgI6cHչyVs?@4#p!zA{xϛ5蒄(,㉈X ² F6FaAM4;b}s^/ƢXJA1_پZczR̉g9Q&`e͐!led[<ɘ%g9kpV=pk!Hr2NNh=±ڏKG aE#YY!\{,Ӹg\NI;I.LU+odTl䳕njez3iܫA趾w&J+1/_燗Vۇ5`HP#b,CO;cZN0fp`IimjuXSef_bP$\TUVL0 \` mr8Foa˥2BJb FXH bԁ~[Y`b-A#"`Cio"60sNa5ŬL 1D+p|<H2$"ȭ<jDBP dA@4S0B&pd*3BS& > 5_egzJae/ o3˞)h<{Neh]r\H#U Z56{Bn)bcSWAC#8E^.U/l❞|}Ⴚ[n 1KET1$0RdI)c=9!5aJ+E\1Z'Fh%s'5vC{EQxmBޡ f x}ybxdCz0ޓq+Wi`?/|dlmMˣxGWkN1oH`%Ūb]#4_{e`)2*/LK)㊸E Թx:h}?T-j.9sO_ lsu<޳LK㓍pTL6lZ=*>68M9䂢&CNc>!J>䶎 åp ||.(tqöY@RU [4j"h2%r]N0£&olž$M|DKn.N|MFJjl"<̾R]C%-`睖dk;TbKs-FA62xEȖh#Hu|FEcb~2NR*]Pf$hq.ؖV164wh,-b劊OCTJ"zK4F3WJ@a q(BQnТUՄ i /f#(0{p3cr8*P"t /90(/X5$İ%g |Tŗ"km7)ߕ9e"pD3˖iQe~ %&B ɼpK a3_J|&5Bb@+nZIC 52^mEtŌ_+`Dy,UN&Q頵Quv=b.ϥ\ WW84HF!qFˇ{fK}.s{ u<ſ>3t3\b w__D'{32g|q8RSC)yXl^W|>E-pE1n ,)tlLDh 0G\E( \]\?G.⇯GϿmUx OxI'CQzg&}^wO?6pw69GyAC]B3"2M\ /eԑ<۲ǿ,Eo|AH*.ڑ<>ytG((SW62ϋ85Ů{粏/5u\Ҿ^_py7v?@]x;} PJ9k!>"W8NjN>|kGMvk&Bv#{1Tql2  eYsD 
eZCjJ̖=N(J8DVpAIםΌW$*';#'q'((aS·9F F _} z.6 <->/;yFjThI%o Æ_^px {>4Hȅ"{tܯ` _ K{5ͧYhl3QւU6O,ZT#[e {Z'NB_{s}uŷz |<T-6xj ':Ȍp3^hiBSHUV6ք{*a6X#*_&/9»m ެ<$znoV5Oͮ}7_ع~^_Wqvggj`e!dekI 7Lf<4z%HXlY0/..2T;37}!KvH7\G#wi*l^@Ԡk6lncH$QJfqyז\z@`x_!KBZ o `j2vsVBBc-uDcגsp̭ᚷ%I0[D̒1S6MMۇzo'[öD1*۲9*DiS#6iDNAi|NۈV60hp%)-hVw0'S0hj~OĒ`o 3[miztQ+)ړ -3ا<XkyNa=)45(xK7P'$tAdߤ*#)&yj.Giy =4xjAP -4KWH a9u(Z#e?=w'$؝dLIU9),Tk.LR :Vcs{DPrI U0!8 Uu7U\w0kΨxۻ6k4,*QTNFQ9-2Zk8ypf[ C&KNe reLq F/ӄ4죽Dk߮AF!-k XͿmlFCT;Y|CǓJr#ZxRRѠbmլ0ph;or>sӵ  0U-+5ӥ*ԵJ*rRJje+VJGʺ+`_{HkCdaL^Fbtg[x%JN`ݤGpw+SZgv:9I4.IlX]ǟO:;YzX$0(S2$ZlcPD]]X_ɱh밹ċ˻,nwMRb}AA*7g7a}Ha8;x)0560BYkyr؉x GM鲭Q{{5O"uK) h}]XR=VM>6[%Jigv nO9e:hdw[AF~֩(gW!$!-Vm"Ky^L* cAA;laSPRX*3{zHBcs+ aZ.# [0/[@F9*.$ `NPD]tf%>444شco[N2/(3CrrϨST@a- JD{EPϔ FҫjnTJ3߱ƫL}UdL L}8hAId՗|.d=+A*eMZꎐjhnH 11NTb`V*3r",7cs]2|+)1dQ Ϳsg2g\"3>Wh*qv$i xB$/Z%BHp@9LDkuvB5NO*-.ÞIڲ` 8}yEaR0eyiweOa(Qm•[f((vԶ 2&8o#\ WqoC- B61 (Ed^At2W<= t>a Ԑ_eL'eq!81J^? q/wd'940bi&nRBO-&)oE'6 P9AogE=M ݟڜ28CN䔯 ɥ=٩xZ9   MA|Qpg A06sAɕ;] IGQiQT\3RQAi#DЃ2@_!D|zF(3Ƭ1_5{ƃ %Ff>n"u`oْ"-\>؉^L1OWN|?$&}iE'sxTYFLFVޕKwMt1ub愾}qBS?Zl ۠ @^P)^a6H1$3x&6'۳4o-QFhOy,7v u|5JIMmfuSFFWI$ Fqcqݞțŭkj)x> J*EcۗfC椙0BdP2 :ɊT45{i:-eD2{ޭFbER (.=ܷ(8 WPIoMS',+sw'XBw$jB:8D k]IFO|>t,.ݛvJz*h-.-agH)QDf0" It}+5lDƋ!0YȽAcb 1>t>c$'a^_ @ Uf,7TQq͟CҲ3]V3BܘiGF4Z'BJ3DG1½DmELf"Sm 5fư:úU j-6s۝[Wpk`Ă sXm)W== qaM=k&{z?;>G4Xf+&9v:3&0j @X|! 1KГ0fB!rƔr)y f\NHQ x&I!*S'T-=2mR F* '`%#`\X)KǚQɏ|ϻ7/~>_Ooǟo+󻼸Yl[nY_>,ߵ~j&e?0mb_ly|aqS|xJpz}\迿U&YW94XUɀWL2.`Ji`XL岬hEBܥG7,4dHmƫ `F`Bz%DƬGۋK^8t*m# 44=eDe?7W}hC&jytSKgݟ^!ߛdF ;>4{X"Vb]Rn3V2çihU0R~}Oh}q% ݧm:^~}V_N*l? UMV7ʀ:#&1BH,+$d՟TG;U_Ql/VAC/`_Qw8ט61(U[rf!f!Չs 94g~>xeO&k_C}"( ̊=W  e X5  K2Du3D ׷ #% R˻N&1KN.Ovǘ?}1[=݇4A)BpSba}ieN)ql;B㞞:8xQjcU[N`s$wN`upZ3ι c8 @8N }J69]7b=dհdtiʷ WN. 
ə%NX-c  q, b,i%Nj ?A,ZF"C)Qa~IC{?ŃQpZ?W-87YA5zz5J|yRfo~2&Br۷j*E^m`(r~1]SX"ՑЈz$X$5.fAQ@"P,ǫN;%J0B&D$qQ95sVY  !d"t!"cKIA)G:8 vpE!3FȈm/5T6epw&+^S"me:]ILܙ\e704?s$O{.}L ]a_1i.ڀXku+2Ѽz DnMm޿ ;÷h m-Lb6~apP0BL` ̣qxD"Gɥ STF!yo"sFaՈ+I*On=Iފ>`YV9%HH`tһ@CG d&@Cc4 Z|82a [IjClɣU"14ӞRGbvgaƺMBOԌ d2iKL&#ԴkMS)&Xq欙 u($Zb<ERTVk7}.K.c] 3ؓuhaƁd6n<)&聃3*Hb+GFb'd9L\@BW܃q" I섞ڌ\l#:+2J2:H|!Y&W'X^gɵ :^0k#˾QYR&X-DC$ a2-hdx@9E//}yQˋ}YYvɺhXAE#r0Ez1BvB% x2 .buY:VqQY6yNvrs1WJ̖w?ɢ??]M}ZxQٌ'/޼eĭjdvd2K+#+WH3G~# D@B*WaNd$wP0)9nB W4< ()Hx\.529ֽNN&bb#hIFfA̸bܑ8 ) ߯onc:g] ]WJ[/C#r}&Zڎ3!먓Q($"pD% y#$k%]+fFe߯s\D#V^|PRvGn|H,@?܏snĻ?<0[ fSiA`L;ϥ%ޜ+r_6N:?)mƋ?#߹vOܮ憞/ Zۊ 2=tgNC]أ?4ǺA r"ڴdC :XEqU=srZ Vn=ƚ1i2fo^uC&M&sgDzү^RpJ6Ieɖ`G>mKhIت)iϹ\[H]9&̂+81Z0s!IEJi ã+?%5EA,o XV*.@:&,B$wd836}~0EXYZʭ.Jnb5, ,SN#LҲ(FRRI)q4Mb 5" i( k-TF3(*Eo=u:$$*X]Y -8μ5`5;j[-ƍڥŠ\kq+aEі,3u8ߐje;X3HNW{3"y$y-̰d͠2I>1 5en$;ucYA2>V#J72X}$a eV)ScdqY޺AUlijTr]. .9pلx"7Lz?i QQfIĂd*1mӑuT,QX\r2v@K ;J:=㕭OvwsSNNjGPE#ء-eB}r[ Reژ*2Q/|oR.9Xm\rRKq\wdQ[RZg 5Ay^ "DKA\V!ƬP ȪТ'.:*~/upb!f˿GyZfýz:8nt~U{]Z\Xc]Or;4rR.r6Nr4HxȎ]˛+bIh_~Iռ˓Bi: +i2` [ѲLq ֒+I@g9rO)yξLR "ihm:~u9!n%l숖-)ioD>D1پl&40><,sQyWxHPRh[;(҇blkm`6#xû\W"]323v9xMd-$; *e8nrǾU1,o|LnLKD10Bc<$øemUq定`2ThrٌU%w D [NZ=J) !0i]aKWðS@i;mZ {Nvˡ&/Sל~G>}gLlp+ =xGy=sʝ+prnJ縻Ylrց֢z H]%kF>cN%fgtevՙ5N]SBlts[*]K<1p.:}Xjj´v/<fYxnaT~mn\*?|jFu s-l2Faǿu\[ eN\[5cZ !]8hW [S!z݆uoo/3ᇻy.9%9ɴa\i h̍'_}Y~Zѓ-UǺPJ M>N?w/)r>/EPueTB)o[t\~4'*?wT(|T '롹ɇ.哵j9VSmI]̧|Hҧdzćo!7]}DvGVNn_ jH;p'om5&B7 RUk^/7^W2_*ARLeQN½;2Zhħi_ `7UW\O4AIrCbe¦׊ysIvq|QR #v.oi$r=1=-QLǻwYK}nè +92`:AV0Avsoܛ\R<}xsp;M-ճ?]wˡ fT0;s%r0 Hoj-E崦@]|y4%ӫ ͲV&=rZCS흆?E%VqTu?+ HlgJy_H2{vQý#C #¡!jxZhHxF>`p 8{ PfiU?5N  ~NnC@Ow;clYt U[b; dhR>L䛕]ǻ71)/>~L#wqȐKiFC}rHp 50U0P4x"q~oV.ne7zi~K66oAӪ&#YUM!p˜(4Q4zH'IoABZ➓}-=v7_W B az:*a">m,Pbq+_aeU{W*hA2iYdvM>sl2Qz7cr#H(7.7V#ػgPF -3!;rdLYA4PD|# \oe?Mq{׉F5G@/ 1plis4Lb_{evr_#MStS5lRQ B[z 2& k'< $c;yDfC&' e9db,U |Jh̄TKsڌ$Ls ! 
c2ED,7Rqu Dᦴ#&.;@G&%=yˣ7mִmM IgEG>bX@|*2j,G +FnbVVpyqF.4R.,7fNE."D> k<}3!˻f>_m?6\z]'s7o[Y3!s(K~ZZdnA'qvjKpWxŬ L9b0^2yZKw>Yz)?]s@l )+++N |eXS!ScaԤr,D+ف"yU >{㌿1%ny]|5̓=dۦf}Y, ?ߌ?y8y ղAe4Nm0Z}_/ As /IK2CSf4x  +rKI02n0sny7JQr*_[i@җi,b(&* w`_H Wg\¡tA;? ~m΄n.+!QGl}1'p4 uHx>J ׼C3Z9$Q[RjYdĽ:ۿ i6hZShSJMI7VqsyˉJãncʐ+=\᛻G#݇0WNP<]fY'/7y&a:&$e،jMXw~^ niĪQVycn6,$} .6Ev(rBFzUO΀uSܛ:gR<՚bu޼ 8qN< Kbanׇ-V"VxOdsYx~9Γ{wW]UoyrwS񗳋[gݭa*&qޟTdy3V& _9Ʉa=#6b{D@%%C*YT *||5R{(%|ysy@˪euB|ی.n}QD,MVQ _1{E]mqZ vmW 6cZO(U";T{ewN>,Ͼ)/Փ?uyV{Z!_cJ-'Ėw7𑿭6a7*<+N$% xhkI\QLJx I*!!/Brr!N 3{BZNs2KʗZVv%J: ²;oOj~[9' /@_A(\`5&F"2&/c q`ђe0ƔŐ/u8(ЮT v˄wKLbɔRbU ?OR7WI]>.[_XujP2keD YoέO9JqhO_B;rr0  ߢ=$e~WUШ, _HtraERJc~ }5kVH~՞PW#~1J1\Y;J澝4#F=Cϻqi~I%Nc}-&K@bDW|;W^\ysUUWt4Sw:3Yh;Me9NJiȥ}]("d}@M<Cl//ͿlNn?VdEyr{n \744k3yaɇt?772y D9}ucjq! !Sۑ8)sT23ˑTHb2J#&.ŠW'2bp)ga32Ո)YNTW_Z\.+U u'!zqb6B倳baLl"e)HU1(X'0/1؂KYDV:[1"32W MmʍF)V;D_Ў~us'şBşgjL`g>0}ݛ&zz.+oיVpygݗ G~'0I_ra\Bl HŚwj8AzにÂDga#.t~sv.3}_.~gRzD b҅p{SΧQЪ'G\Zc@y'L)XSPf2ӰU^qtceҽ_OI!޺ͺzX:D ?m[gO)lg>+,^ lN\ F+m׃yv֑unt6|Z䙶KRKpΐx|JOígG/1$otAAn?RXrj?3{Cw@3 Xs1YjA>ETךEY(v&g~Z'V, '@B'=w֙B"q ?Z N9&AMZY*eg3`V™P,u6&9fY F`_Ӝ*rrSVtw'\K^Gbz8<5GSi8}rc7z`,:*Dss d)lʐm̂MjC\jbʹt09̄*W 2s%sFBVՁ] fbBb~T-CD0cLl;*qal96ՅrOa b1);Fn'JXϳn}|v_SI,5~D7-޴ 7mGn[SirLU]PՆ OieݥS1Ӑ!r,maL{ї8nyl[ ̂ҊC4&wK ɺǒ5toV9Aj[\'5T>rMd,9@F4ol/Nt.QMt*9hQ.Ixt#!Xu Q^TGfLn LN@/Ks $j &<-9bAyBT\I9<ъ؄1T4q )'o-Z8W?.OW+=a 0KF `%CncʐkozV={ #JE}b9vw z)rt;sѢO35ZTJ_t$8|zt.֓1~T>E)IY,"u M G=GcE8%u;͠ԉԺpU̳p **G1*?>(mB\f*LF3N\߇ Ƙ4p!ۆsz*$ ƬQ-~QaB 6*lM*l16W# v6H '/DrTXntKrvdi\v\ݮiN;.M:*z͋b:st+3~ ڷ|a' cP*.hпe-.Cᓆ1ڐwTڰVyc B"OmqfzH+%9 dz82N詏^0( Gӄ9xPL1|1NkSѣud:qG2ڪmmXwB%*%z&1ywj+{PƱ;H< TNňnX`dT`0G.y y?GATۖXSQR@CVyɨhV0=_O&[] _7&#ӯ|dLWrq;^g@KsL.&n$i0S yƭ̰̪~yW`~>_.}uԢn[؁.f: *J/zk_#uEyr2*7X'ooy-﯋ypo^ZY,~<ߌ} &8S㖋H `R۠y c,ՋJOE.|PLhDsˤ,3"@ 28,+NZzMНjO4٨4/YZ =IBs!`R#a|rr $rcwm~H;ĻX9 }8 6=,{%Iߢ$-e-[ ZWbH~1c$(C'8 S[ciʉjÍk&cz eQ4XG|bT*8笧ւkŵ)_K9{t)\ Ldե6)հ Fuu\a7+!0 s.k=LSJmՁ䶳w܅hѻJW]@*ޅӤJJr o_hF\ͅj'=f0nm-0٧#҉2 $aoBRvX5-ݧ10gJ%?W\j e.}G:b!ɉUt yEl 
꟨6슘?RCX4VcgW).s;+SBd%;fLf⠛&DdfʈWv1gH |:z#lOFZ)5펴̫N<[q{WKl-ib'+r%_kٔc#۰dͺM(N%ڌ15+3VF#|]gzsDYy$t(;mYR'(J q"lwW>T5Jޛ$e`+NZTpN䐵@L C.yRc_A1s.gTcKsqQ;Zsڈyam4C(@8ǟ :eGɘB*5 *́wzW 9[6^xɉHpQ*9R!'"gCEyKc8jhSPعvL7ʟkmVyP}ڣBh;c0P C}^jکxb$ϝ2iw-v]~DEYH!~+c]ew"U0}7l:r;[   R޳oc @SMy"_'9ԧS "U~݋R &@iX} $0; _;bZ8WsÝRcZh(hq7T_|L :+e飇 *xثa6Orj@h.=V(Ch&R3ƩͼJki "Mf.>j}yopθ쌟2"]CBQ=2rVeY1J{$4aqM)ㆦ:CspˈJ Ts+ j2YI@t .*2IjkcB꺎:aPAI@iÄKdj Q\x3AS@SG=wX%u& QDS&x|JvPCIWpaqIbEppauYf5otvrE9hxFj_muU0c:i׎>/#|ʹwD} .)c gg֏'L8E)1FafwpP%@SNLP>F&9KR ]5uoH"1 ZcVI _n&)9E,M<%$`8M<9 qNuRn1Id5hi6uЋVqHN=q -Wpgus߁=kݙj";bm2!ݳJFh:P!@^c/*;A0IP~фdC%u4@1VMHgz@$Q {7M<]zXmMjyo_lWONVbUaWǛ u aԎҶZ Y!* 07FuΟuRTuX= AU{(ցJJ&O5n2%0DPɤ#Ii2B&6)L9 x y(NPɩmXqp.KQEduư P>%0/9b:2^0ͺqKH%պbB(#VI9<>P"G;)yNaw"4 G2P Ġ7\p,hц]npо}XA$K O6Ώ7x%vGv,UCM4AkX'VO^Sp(PaM~sLa<+ӞټMg6 ΌwO3ְJ=}UadVcfs VFdKz8G]/! aM!Vy'czQ枃qW#zsJX( L5q^F\iޢşc{w f Z0@IMHIs]a:G>0B!HDP#j}TD)&D1-na.kgu4k4ϬYJyR3F@EgNT'$X) kL(hZ^F@vzjXz?9Rؖ ܧRG ӜS!UHp҃Tbb%S~@'z z91$+mً,#؆⋝pmD4esv/)FLlx QcȇQcZr83V|M'_' \Cnd6 s+4Yihn1VAu1\|+05ZhcU{&UVt ,%unQiV:$?5DžGޓݏ='#f׏'zM ;jʨ&`Crvنޮ] pI]7>rH&J5,)_gz0䧗wl@P2Ҁwa.i29!R:e%I ;<ftܤ@FhjR& w.uQ u$$M@ӌ&( XʠRm)i22<=ϐYol :jjeTQ.OSHL ,($:2NTRaR~KQ+&up;}07V[YjJ %M¬R1>JE Ɗٳrijg 6` $DWN B2Y2cAP~0huSK-5עrB%Bq/fեZ5wO]ɈڥT[]jN9k*{2ܙw{)(twENW;_uU_`&q=zUl()ǵߗ T:)/]"Y+_`աuՕWK9GwiJ|u]' .E6stwawsy~l# !l|4 ZNj6n2K|}@3@'߯2Σlqs%:~ekHζ%}rv/\\,3.Ql|-kS~э>tc-g.G[܌4z~BG.h{&x=;N|Jnxg_¡#|M{]j&΀j"a:Ƃ'cFM eT!IoF1}߿1w$Bf6HwHW8?w~:v E5/9tQ\,xWsi(c`þk 9:r .'qJ;DEH(o $c\V߲AxW-{&ϠegQ m]Xu[CU6ߝmo~5]8/ҷs)iF얙T6C0-hӶ#3ۅXsJy0qXȜ@>' r%WHՄs<u*C$ 8gVՁV)38qFhz?DlAQJsQZ9X2 qb͇^DC*Su[b HjY!b5ݾr$a 9qQ*Ew!DU)JhKhIdGEwB)[-HaYDwjD~q2h 8UV]2*4 *.U\Owb1RBj5կNxqA6Uqj *_e{UED~HiUp7ҭ"XE&'F[AҎ?Ma{ WʝUZWylt;Y~Y=#G!TGp.ͦɻͫ ,6~yy=K QQc:⥊.Ui=XLS!/!06ƍe)IN^#K+%J2RRHGbq%B;e6:Xc(L$JܘiaR C嵋:2UH+bc=]:/ "4'w ׆ߐࠊs>/}E̹iw6iw{\SeNf-֫\YYBzxTppg^2IaL 5ViG>Pu=8JWA%!\% czU.`~?ykCU)ρȂ+ޕ3h)Ml/\W.׵Tոy%"wsmE@< "Yn*퐡hÎbU~{TXߠ?oz~PoM}cin)ʷH˴|`z+mtsBWhc]y0 xZJ"N7/<]::P,5I٤V $eXc SFBt ٟ)@;ZhL z!;y 휆'pBTNS{TxiK0'8g! 
2s1J+X d%㭌rC Tqo2oCZ6&(xgMPyfFԒϖ$QxG!W A8Ml Q[;䵍vXq9CH{&Q$B5qm-7]vsXHw$E;b4ERTC S~TOÝ3{&^ZؾoޯKQEj)e%[ 5nyx |q7^f@U辽?Yd7g#%8}|Qef=!]!3CnBţr'l"p0ۅdEC3]J / s-Dڧ?^Bv<}%Iͬ/\>iE1ܐrM){nպp'nĘN;xF- zEz&,䕛6Eqȸ鈼nqw[khv8[m5ܥ?0X5Z &Y5;_ ?1PBWK&'T?9^(-ɴ|%8ń5DwD7$ @v0Z`G2 ?`𗕌e*RD`o}U -C e/H.V݇>‚L&Oaz72%)R_Haj kU-XqoR)Ry$q(o9+jh.!ikOhhц֒SPa >:Jus YyHl9Yּ0LrA3sEcM,\KZR͋hg땝&F<@L9&F/s&NGz/>3wZ pujG>cd}:[z6a=_@bmw07 FߒI9~ ^;IEߘuZ(/\@xhki0C]XTBxLB-45 e71W#n H@qSGqu'p86{_VR+|۾DSBYq1Y9%ZH{;;!02EEa5\pkc>w+ż5Т <-7^f|/5޼ Hc<+aBg0A[d۞h-'dMGY<}z?K \ē7o0X%%9 Opip ` hA0 7U T|;S,*!6hwk\^ ;Y\>vv̙j vݱx=J"ZdˮׂZ&EɎ%0w#52NR8wU)$(2 jEazZA-ծ@Hq@ԟ*+):1-%u%S7^vJJ91PK@2STQX[P!v( 2`F )a(T荃ZX !+O%׊Ӝ0VTi iqsË 8s.$&k@) `(c#p?kC6JbVHdڑ WY "K99',*JP)(e#M-5SH95JB`o B$x\[t[FS( *êH(wǐKJuU6F HoE˧U]QG.ht jh]INU=ֹ޿Oy Ul*wjp۳q'%MԬ^cBI!? ,x %C-U}Dc> LDZV8x:Nշ nJkTR$nk09`րE*]#AMx"5m9FH{f5U *@zPIt>P5~cTZkq  m)FqKF(uk3aœ1s3e":P&P(q\a~,lrԼw$5:R2j WSS8oEuPos /!*xcΑ!OtMDir6]SPL8n8n;8]<k@u:]tݛn42ֽ 2k6\?''ao++ڧjzі7˒v9zyOwK }8 ?=ܗ]ʽ >&'zy)I$uА]Aת6`uEKeGO,{紩ᖎCuHVzs|DB h"g1Y!<䝝(V2vb޳NԍpH 6=]Q*uI [c0ްr6Z6TDⴳ\\bz0Mh˰;4e .TGlחIM0"dӼ1riX"{j/ R!M7 ҧ^uyQZ%п / 4`Nnx ]7hPw[ ˯Y<# ~}1`4oiVMsހ:N,F{Ŏ%=nj{,8&"1N˂h?|,UpTb] E7%Հ!8]60Lwwn`X$hkfBPIq:k_$:/SD s8(%T  9qjE`f Ƚ\AIjQƀ^=tСs2`Ȱn>CKo C+V3|7ȱĿ|~(wא&Y7n~tw^]>1xzx-]]8n-mIF{ߩkkShKʃu_diStV ti$A4v1Q5i∀N|޹^^OI6!YփNudZc1ªo~7ݶ;V)V$G%n<ڽ7e^B4Oii]VS&Et\=[k8-SQL ѯy`qrrv <UHI * ƬZj{wRP-gUt{u\IE+<[7YqaMfxnt<0D"pJ&4%Q$3H:40NT靍842Ɠek`ȴ$<8H˻jm `$bqo;5Gs)!XV&_+Ē(3%~]&M+ pY%VNW3:SRTx)Ɯ^١ꋾNU{NIt@_(IJI[k'iJkaj/# U99M}5KڪGfJh' H" K4&#W4)RKY=#g2TFx==Kf}-y8͖^)IX^1TrKWWV~ʻߕ#IeI_򬎗kʏ D'_H>Zj%0CʣaeRcLi%2 b,,{_Ꜫ"]g/j+C>Ͻޢ&411og[%tfK_--ҪKb{\}IJ[\*^_u@ʚcNwMkda;zyq-D\ P[ ɯwXnϙ"-p*gf_f%`.f~=x/Uj2zVqpAUZ-5-A0jem0iX/;-@s}^i\+ k mj% @WӊSџMvܾ $ *F #hx1&49U_-Q9ٻ6v$WX0Hr̼ dY6$9-d%,V$%]X"S+/텠G,!%Q1/DX F_Rnc}Ok˞W(F3? 
̎2z=.Tݱ·r\>ӭn$BH LW!o^?}~s E[;=\Da/W0K{Ge߉Mzh 3Ra,n.3o1ORL -~ V~?K-KW"!.& ޲b%9&՘ tNb'\~5&9>|_G[^8:+N?fz5٫壯bpRn^}"X—SQV:]|?yۻ2Ňjb<3Ը85C5Yi޾bq<[!+.a?EJJvkXW񊘝vl3c;y5n94̾}q73Ǩ}X3?E ch?8h+% ۬R|uG@sMJU]8['#U1-bZ}, =>be61 q{c9F zO s]5eeJmvI^H՜uTZǻ;5WǻCoBޯIvd.dJܢc;[ճR>A}c|6Q T/>*8D^ 5L` |~rs0ˌ~ Cn92.X%iKK|H vq$ڛ"O d_%{T$~wl+=j DžtS8|=|QIgƱ-x#T]-$F4"I\,0X*欗{_C 2k4^(Z ˵*X;iF)^(H3<51}*Hڠ5^ܸ"I `|&0R̨!! @X ]>˕87`sI+#q#,Co$.˔!&:1ҎKlV*.QhqB Mcϋ{S=g< sd賠DZ9)ݕ*hG !MT0qMT@I(p} 2`E(\RP,]AlS(U ئlQh)xO7_Ocu0Mo&D?3oQ "3` ቃw^Kh}n;׃'Om9:8^~y`yNG)XI||a Ya!@eqlg3 1N>nOW.r V꬜&͌q2Mel!_,R(AԁIC30!"i5ɰZ T-Ʉ8+Be'c@!eYqG5AK, Dr$28Z -lKE?"oRZurOj1o9$])+m "!CE2X3!CFrm]/ 1+† `i)$,Ul&TR bE{p6f*B7K#u`i]Z AAQ@Lq3DkQ C/1(0k J#Ôdb*F LzVؙ; ZR$,e,~|fkz(N IZQmhK^G& ^sbv\cL}jAD YK us2FW6  SMNWm 4DoV2)XqC4Ld"!BZVui %XrZBHnGWE) lP;#@8jYe|nM$Z#&HRV=JqI6$'yǘKN8B5M?\f@_ e@O}.a5Ą a*]z%9oqu ,' U1 KR͠meS0>?ұy3_^^ͫ_٢(~VA ؊Vc {X7,ټ7zTIw^k5J9#7'݃XԶҳV.b\DkEvv[ex\/&h6&Ŗ0I90)ŅB<ϐ3{(%z=V0"NW=F$^k[iquasAO䠋˽شM^lũA 2ˆtW eJVU27jOf\I7~ThfW udi L8:T;MrS (Tvwfb>/Fan񿹏vwL@%P$߭Cs6xWl6D59ިc \a^aMIӄY.jpcGjBXs8t\?/*Wq7h$<< ߾ 8Lѩ)x1hX6я[A J3ξ$S],msqSbrRUUSOwbsiaj&$'2EWn jNgTn۱҂Ӷv_5.Q5!!?/S zs(|"6W0_V60c#H0d&c=x:r, G)wϏsba;%Y`$Ճ4/ 7 %Gv$¿?\C_/3Ksl- 0 7i@b0x1*ezhΈ:Ru/V6P Yje싽S/q9B%||Qh]cr"Z34XCC "w?=v?bhI3RG6=XOp_A)6TBHIҩ%}6@Zhuj4n4٩!4%m@Zg~KAH㸦 b̐n8V3}>6Y>J! 
xHil,0>c`*(W0 O:cḊby!!靸K7 o?,5׼`֔+qmޔ7]m8 [ ETq%Q6S?'6:1m1_MGC,(L83bFN;ch2Q5qN l.4 ':o6}`Y4 ))2zo'D?~k8 iTZUƫ$tuOIEn{Vri+s*-yXLe΅o5-B0dv)*fdTD* w~WCW띩VмF]!G0^X Ko]B: !TӕaV&\wڦ.; 0y5& <> _Q^Ū@W3W=OJݼZ>iI3O5SPR dW]qSJ>,Bc~[υ΄@[wL18H„ o3Tc,-Tqvn!3RYMY+y MqyX˚ }nd[턔V"*0q00I0fBYJUK,OkӖ2s))b4D;`0Q1΂ T1{R!H5fH fhX2,LvmLqHZe kT>Pd%(ƴƉ~;?:Xzi-d w|ۧ/ݐ"A >C޼~p,+#0AD Ag㫋AMg%|Bwf6u%<'>XF/n~ 1շ\XrS&(dxbN>N3 Aj22'Aê(#eSסZR$_(O*As!O%6Tj>9%X_ T@wId*0yBX{S0NJ7Yn٦HVMPh^oPk Hݾu]ҿ ]gxrf4E NʌzcD 1x(e(8L.2jKdQ_u M"n3nU$D$SN6طs]*P V2',1SUaT jh&C"yFj>c +)7ޟZW'0z05f?7*':}C[wYMH$П#wEIOu%vJrDeiҨH?R qPߪ}RhXzY-Fmu-gz㺑_3@"HS&fw o<,iV2-vr޺!Zs_Ev@X@0M4tB/$ ~L/GF ۤGc^~/1B0/v!!} sc|% u,'u, RL'^b#%BA"WIW":9,"7<"XE)hj|JpƧ-?8*цI8/W M.x%I]y-F"Fa(4f8bmj͒*S@`>?S(|³@2^'0=Oۄ%gl}E= ; ٓm)0$H݇y-T 7 5g>nz,ϧkݗΟ|'kN/}kT)E048JK!-xE39mJy?^~_vJ}u=D'RrxHQJ౪=a X AiLI"wGB-̤ԛ/џV m5CZԚ7c|v%#uG)'rq0`\_~-?)*;)yF㰧;AivzmK嶥Sїtzx@mhKNۓkqn\93ajVm}or`zak n]v X  FD0'4 ^suۦ%A@h^†> s3`EX|Q]ƸaUGwNPˈޚ*@!*(RBt&`Aޮ[_w&uB#[8.* sF1AZ%r⪒'**“Qj>pNv2s eENx.?ɮ]'/ɫG-nX!ڛcFr`)g;!x-[eQۧJ,8ccMI'c`xy=6H VR;W4Y7o+PYT󶴊C4 ag1:tZRw%0X [%xX'ohA!x%񯽨zHmRWV@"/oV +YAj!Y]ŚGRQR<B%QKth㎅#A 4~zVk3A 󰬅a ~S`8"ީM`{C6bGG/au%4 yTQZG~rT1xzvhOˆ.t:;xSd\?havS~%+fh2NUbu6{ڍJg!4ήb>*Q U}*QbYPG)Q?<fQ3ޛiy3uN+_NH =FYm MH2ӡHLīB<`[YEQ+u>Vm'!VkON_\yIOe(ЄTA] Ҋt*H!H7V7VEf""7N3ZxD..Ţ~B**1:jITTS D@x' VQaZhTΡLMʺZ6@ZdHXp]Pp;v(-!= f+U$_O]/1]EF'o}o}8c81OwkjoXLebW@r.c722(IS?BX#K'A2T lB!P갱@խbJc 斱caR m~ ADDe]S, /82][JM`QI9,y@ \ 8aBwVK,h[f[&)3% 9@9<C^hyg-htkEJX ܞuP4κha|({^Y0=:?vJX''ZD1O [yАU Ub+a>Si;xrƆ p`R": "9|jq8,/_t^~jUx 29эӛnP۝]qXJk\^X7{r2`ȷoku+XFcܤ1%:sPxM.W/"]vuP^G%MIp F@jUkcRK#&]+q~+ԁOT"/!g΁\ZD!kϓ|'ѧJkDc1Jߠw`,$D\jPOU咲^aoR"L@&H VA+MMNBXHf Imrj~ة#MڌfR1 =#!~5SnP+1q .exuւLƣ*X:

    4TwAZ&~}jgpsU57oZ_s0',/ 4D֏+6Ky^JْsŗY`)G{g uFPKnއZ OaDLf[áL3AYޒb,c<ϋ׋E/QY['瞍29G߼VN=#d{d{`7jLNɪ]>Ni#A*{O.o7jPDQג%9 2QRe`P\DԷ۾7ϧkwoO.ݓ~ꐕ(hÙMwf?@Ba)''+ĠM>C5KYJʖ>kiB_1LGy:pH#\n\9;{BuK\[j.fg|{r-cvvuR9ד|:-BxJʀS[O\n лA#7^|I@@orm!Ͽhpr*/`۞bakBEG:&]0ёwngIf֙T[ (#kR㰞2RG?ߑځ; q3|G+|*8r:O)}MM){w# bPb:}ƻ{N̻o S݆Wnl+ބw3M9{bPчVJ>rlSV)IR O &,9c ΐR@@-I:dY#Ѕ#Kj;0xSt%OK'΃R a缲$Ph5]%?u({kڻ$P*D`Q@'{H4YcdY3t5(XоAekq0oLڵgAkh,в6eVsd7YD-@4a*l8vMyC–kF LNc¶ПacO4]ep/k,8 $]q,@A h2l?9zh5 g|Gb;&6Rv0ey5lrNݥ89clvI}uQ ~[j !/%w}&`>:U WnlkԄ+߮wkRѻŠtw/:54wxл a!D{ٔ8w([_ j$zUό FxzJ/-B^nٔ-)PYpeGOeKF3ڈnl:,B3K>v仜t2]d*Mm_np.xJw=Y~Y_r֠lU9$0 -in:.:c"E0 ,*m PF8rk^pW \#9ǜƬ|t+;VZxJ"c7+kMH#mvtJmVҁL6J4a5 eg! YX*!YҒq[ |;,46PR/!J(T.t w7B(tr;e A 0vAb`kB")bQjUY [- @6ITT2kQ }1)i+T+LW *UIS]yLMr#> RNdc32 #r-XG^[Zn4>69%sH2BR~[Z/Ɛ!8Z@h(c$(RYfwg:^&5ֱwC]fTz;  +, fsLqʣ) F ~'!U&<|Y(gzu_ el]H]] a؃ ]ss;/e;Iq\eL:N̏(Lޟjkyӭ堈9YCD 8,F%d<=,86ՎTTg!ZGK̅jѲYrT(]ELfk1k@?dܳ)!Dr&ơɥ4.g|Ƭ% W=kJՋVf#x$D<=:mhZ%lI.z6-[7IȆ`)/lq? .h'BeX.6d/yO/{n9x?7=д}P#Q3(R1O0/R,{;-7B6 .tdxY6WTa[m}Hjwq<Ҍ8'>ZxmI~zvo6X|YhKf7S&\O]$s Ue>}x`-;Y+ vvuvl߶/]~믚>B}Sn,?u 98ѨoV4%ݚv^7|i|n2jjE{{vB+ԉ;dV57?hFZ:A[ W97$=]9՗m,Bޙ^;M?= q9p7vWhڌB7-*2_<2A%d=1_@e:(ү_ ƞV<Ļz#$^O.* ux),Y1X |wx)F8fλ=q!<Vx`r݈s D_ךW-KqXQAaa ޥ)rڞ~[{mdڦ83ؐ]GX AN< :,1sPB֧,4\uV'&`Xe@6c k>L_f+?PHn[oE(Z>ZMYvR_쨮$ƤRpfDZs(L RȾ ^em 5?m-ܧg.ӺVk2}dvUrundKk.wbx"Vbc`!dz+ ??iMYXgo7<_e?j:-aKX۟C]J?~qQ-Ͽx?|\޲i)g/G~ڭU#~!oom~xk>Vb5nSܺ ;r)QZϧb }94ReM2\6%(=X@3sօ}9'9odPGCԺަsw%hW= ƆR҇9:;P(*-dvJF4?Cͳ!T pQݗcj)~_*k'q*vtUr ݴҼzDb'Vf(G}tslu/S9B 2;BКg 5>ni-{Ѻ{A;I)c/>>Х3s=Nf#t!hA{o˓: kC]n3κk(x#ټ`_Q6յ#{viT#Ac=b(:1㤭JTOTl&;^-Q )Y%AI*ɝ*Y9N*Bx . Z)m]Kih״ҵjQw-LZlZ -͕jùZ[zWS9l]U-WݻzZy9N)_Su77S&ezs$)?,xz|?bf}spIF?uY$痑-/+QȠgA0BY1}[yzSlFSPʝ}!:蓃J%J7RP{ 2y~4Xu''1/. 
K= -BO2bKz'6j _&E%~~]o3d`b^XUH:}!J !=ajCޯ>LRiwq'SCr4UufdYj0h⒞y=D_F(h}n;wSXL}Q2LAL!X2eѩ&Lw#ձPS: K.rq_wm%S igc$QjEeI,EU?Ap9~yoSŚ@}6ba㝹TFXU(Yh(}J84~@R4l̗rjH)33NhΒV{\4<o<)#kvHrN k0Q"s0xm0?X=]Ҩ-T$ZwWȟr-M3@(dR|biqܸJHq?k4F'r&qJDŽ& jxtĈ#'ޥI ![dJT0H2x&H?Spm ]ĉNJF YG&q.)VhǛo&[b$B+>E2]N7X_2 g({RzkVsz%]_`2{5M$NϋdV'rƊ34!gRL| HhmNCfSʒJKQ^0EP2K7|/.(.*%?j~+'^ZZXҁ4Ȝz92n+j U꾚CB]_WeB I.GZ2hJ\Ņ2~v_Br1t۠rzs}Ϸ$2\WFbH-SHV_ 6.;D?痡ah1Cds8O"̉ H%NPr1J_VK~4i+l;c$nh$KBaze0zuw zw E렘v\-s@Y$4@%Ih3$(zN: ɑN[ct1eo7G4@^Q`d UDuqpb(%JqpZc}\Ti"̆L:ldސ_($+!B ;ofy6!@*1Yfj$ἢy]{U;ΡdT+-<,M#eS5ֻZn'Eـ ђO1!kf[a{X%V(f~q'RRbvuq5)r liR&$ +" xd@j"թ!k ZgN4mXdS?FV~w"DxYȬA~hg2ч^e,n^ 0aO{nO:" @\19bȕ\曽iݼc`lk1К11q$WI!Np"QAEaOfYݒ[f  ) O,;E^@kN&A/fDiPé&Q{+f #'d%X= F϶P*UAțJOL زaf!}~VC{:j~usqqQaSYB4V٘P-+hXGx(_s/έ!^ǦX.l&u^}ծh N Q0SSp"!vԪ+6}Gfb$5nt90c;ެ^J]PEFAD-OJddzm&PLG"-f&m6?S:/Y|ofzbeJ}LQL@~]@Ÿqcbs5ױcǣ%g-k#~6ZԞш/8DץU)7-5ĸ:քѱeӨuwǖ_cZڟkPWļbdab ڲB Yc Xf?revb7!4['ugJ(UMkaLbWx ppܢ;u2Cp+.JV8 sɫ7 1pً6lݮiEjbrQZz̩d*jh/7t5yEʵ:-Af .bFӬ1JrF [UٚBCGmuъgֵ mc7U!(6`~F ^B᝘F=})jjŝ6Y+  (ֱעQ1IUuiQ˺>z'R7N${1I!)I, VCznvWw(߄>&CGZrUJ5rj WD}Lk 6=}|ԢK0uAUw{3uF3'DJ`#nBվѡD팰OP\veF&!h M7+guTFm FA/>ѳ튋A ڸi}L{DBC&__7歖ON/\DM抈WM'@p㓻bԩ/}vq~~0|.bRB)@ IX?K-7z{y\D_ླྀVgo]TQkP®h -\؇ ,c;^bvUU{OhHf<f{zbg<ɒy\1$$ƁQ*`*'1x]#2 Hvp]w\3AHf1 6+Pe$Ѷ9?<+..>빐|ѼR8nަ,jBΪs}Z" }{8-|5) 4j ȘLEAH8(v볝uɕ x|)_  SdT2Z[-#@JO@ɧ^FZZaU =FdOй'_.~k5fW`X:ї",P@g mgSc LS7;HawSw;CS4..Y9!y^}2t:zܱ$lK+ Wya9r%wyBIM ]Y. 
(BWz [O{H/0\9ݴ!~K/ȽZ[YVWck4~[0a ³qS˞)~sw~;JH(*DUI8{]q2y8YZHȨ k3ٛZm_[8͑Spɼ;e;ºC/mlX HtDObxFElb6}FE:(c%;訧Bk٢-UυM9Dvs9D+P9(EqAV@% &NpXX͢d߲—e璪zgI*:\n9ݷ?={⣧k(6O$rjl9RЪ jYE_[hHnD)_//nVCn>2*Rᘘd2yNSMHE^ś>L?|~7P<%e)1bV e'EZf*%Xsj}K "2oz(|:¬Ѡ0 Zƙw1DGm?$Um}hBGkYx1ɲEIK%Zuc&pqc5LY/_NCĉg]EVDSX*[0XnkϼJn[܁h~Y HuBϴp M5M&I} sƇ}|d Wd[.EYgET 4B7i L(sׯ%%er0D%4O=/Tw#لʈ7`a.KrLzXHn~GD6صU X2Tmlޅr>R՗#'CmeR T[A\*s:'+9 *٦}fCh>x(Z ]B#kf^!LZ_-Ziènl^$~HT۔%./^,(5H84:k T(M"(Cz;:%5vG{K2xuьv2(WgjjĜ:؟F=AOh M>~9~hqE&M>~1j1:_ZKiI%ё yϾ?2{D?E]_O_xyv?Ohv" zkYu?Ь?ڟfp=ơ5.DM^~X^޿}z|r|h*ABJl$DٶB~giu'<#3 ;ϛcƭх5hGJNODczUnVQ.P4 mqFϤ/~I\[ٻҵr]D&4y<&LI萛2өEi!`dY^e AHaO=DhbQ41E_^|־MK-O:ŦRV1ʮ,mO 9,<=.؃o2IѬx,ۺ V*~޼ݗ:}s *3p]d ^i.|~~' |Z|Ο.F^1R~_~n}(*MxIAd`QIȀCzoK-]M߫t|J糊$X@Ljb&_Rjכ 6Oaq_#E[/n%ul `$vH_2{< 8.O6Iӿ[BECɷ82yuYy{!"v$D^O&8࿳Y-@:_dC2 .ĊFkp; -.4ݽQs5#"K4J47QKFxe/() A4t'8ɽI='~"g_xg!s h4rY*{>Rg@̤h|)YT7&hH3C56V[b0}8,8Β٘lĥ|7-=_jK!]_.rV&@82:HC-\E2,  sxV5h & #-앤ckcHCʀgC11+K\aɐc΢xP.fExEܗZJ3bxt{;??{/Eo4p'1G'W=;RyBo!<3#ضU< RI?!" "a}=DS[;?D@V>Ƽ5EgtZӚ^Z%#xL Ì*--/#aIm{%;K}tZ;99Fe5Cv1Ҹe0DZ9 |JsJ|iꀨC]lآE/a x4_d6? xqai"/aBkcר,{],ZCΧc|_B{т7h?\>=G";7Q9FeWVZ Tc%EYhb;{x@^F.ߗZx[x\hN#ɳU||_oHTo|ó̚`pi1K1WJ.y!\{GAPvyz}}js# l".>}ܠ4f`ݨG2r[4*ezQ*3ےh7gf [ʚ3*>T4rzqzU*3 #njK1lKrbCwkS<boM1瞍؎NMg3sԩ% Fai!t32|KIWB+Y^<]t]>>xc!¨g +4 Wꬲ%:YQ?vtE@W. Dכ.Rb`=Tp/T ̿dǿd! rh>x)rKKr~PWPC2`x^l_i{RXŞˡ ۏ_<%|!v;DVp4Z͊ (F*\v 9A NĸШ~O5tIt)!Km ݒSEl?Ǝqs |ԝ7HB5̖hf˜.6 ElI˘?~AH!-]Grws]n@G@*Ev  ѯ^]Bk%F33aOJ.린sݵ~wKnZVf Eµ.]ӓ  s;h6]j ZSHqPA v;6 S(Q.5ovJiŃ~EzS^"~q"_W iFL'F_vi졈!7Ѕ1+WDvs͡17~2?fic1~[d (LmxCZQEfmM׫AczS)BbxXM`"N[' ɿY }>bإ#YK곞FN#lAIK&c3oWԤX31fW c۱:Q 7EuRKxM1˅FD4Nys oqhzع.1b :m( 6=8L08¼9M?9}$hЦB/Bt 6Ȋ\6N4;QRPxFRJY4[d_Gq~R+-ƶ=")"fkg4F Erڐ)TV|Cǹe??_jc\8;+]m[]:-g5i>q7>e9FpU0axt-TG { >!"bhM\R ؖ&ԼלK,YILȕOE *2r155$RtA^Rl'^ؔF2V%9]m1ryH+O v-EtmUķMՍgq^19@ .cֈcp)^7lk0Igz8_!r6-A 'l/R̋l'?C]F)69C]"mR.]]]]E8BZz1,pRfmzC~"T2u3'+rkt/Y(T%[%jxv4c gLjr+򐽦=k6$&nqP9{eZxM┧TPQ8&eGX:PH rk ì7NqӉjļ9ɞ!p]~2GD9 TD.$tb$8pZ9QTP@0DI\'jդ3Zh :oە^i"\!Sȳ4M _{S; njy;깈iu΁QN3aAd{ǵц jIhc`;? 
˙FID\g.%=,/12:G־Ђ(,{sg~ >c^ssM }0aDVZEΏGŏkl(5 Yq:fq2~ye9x,IDDՅߊ-xyԛΑn3{LD|'|B`zjﲇ~qOgVpbW;㓻KcIDM/J3?<2FH58^ 0z@3_oJ(Uiju࿑=O{i2>]ƞ|vy&'C4=; Sx> (Yvhmyh9w}q;3AGy0+!ǓMdnSFvrT0&q|8gIy?.ekB\ug QN׍pmS=l>[[Nwlc/l\nO [UtgbL)+} e"kftpD#-65yeuy_>8\=^] }X{I.+ _YR]ٺC5tq1jfB7]Vn)K&HkXWZ|oPb|iZ2ºi 9keUZ!̜-+)UV3cH"` FXC<-$Ťn4.='=,FbdStLVxQs lTGc::xƣTHa9DD?t \Ҕ$T/vG-$gdPSS1eu4_OFc؎$K2'y<-KCY5θPe-?7oB9jSz9srG .ON{5ːMx[7o~r9GZ}!C {3޻DZ]s|:ʨƩrM&Fɺ&jSi\1,gڬ?t>-bQC,D,D- ɫo[1Y` 'Gb\(!'NizprKDiNZ3WTt]h((AMN%&[+J$LPm\6TƛEwɧ莘x%Rn2*b)yfY<نt"Cndg'gƔ.vCwNQnW0_X_섵ckw  : ^\f}!WN9[\zOѧZtZUwo&!kS y܈j]&q,q8h0Mȝ[F)%n^w' -]s .`7/+-ө/+l8beq2H_.lq2t_ڧtӫD$T<245) *s5ɝ%ɝI()߹K\X,q@Y_0{drd`׿bx'ׇ.ST}5UQG=eQKjJyp2#Ĥ _M-gkpd6Hh<a ׷x= y< =vƥmqS>_ d.*+Rr(ּU][6g<"XH8kC,"%'li2:"z]鷕M/ڣmL;H#3_}Ҥh?vi+h@'R 9q o_@g>DK]#ҫեϡ{: 䫏OT7kP+QdQdQdQ4E}9 ƕ&y{pi:ĢEa2I4ʠCgj~ȟGG]v/:]ɞ**Y1벞C\2_^?.k%y  c\{ Ժ+X}yCc޼p-Pck#z oj{㸍_9ܷтCҨ  Ha#%-} JwxwIr$Y@ CrK STTBafA JcO:dato)5F3JX!ڙcxpk虄!/Kk2gTW"St)W˰(TAK!+ )rC+lC- c"MKE^q sy'Z s&JFL+Y10JK#[;w߾\̿Iݭ-2GM>(6{NIAˇvN!1Zm\yx ?41$~yU|?}*wWNOc0Y~5T>ޭYwB vo/ND!pƹ2]q8S = Qru wԖX[{NeqY1eԿn[x inƳaF˔g\iU(IP)YEנ*-+(hcUy@``j4"M99>ec pl䶌uރNhʙd.Z̥#oHg4ZohliOJA9LQ-"U󄞯O-&1Hl ɷy ;f{|wpzXæ%4<ρseHi wos=MuJ%lU3XsYwܥhi=p]ѽ Gw+AWYJ Dڝ EΜXIYI- MJȬ*՞ wRKSHbp05" M藖v1%f3gOjuQ€;ZeĶ ⺭*(OR$LZ> nv777O.nRp3H~r65 UutTe8p4Ciff|ny6[rke|}Zr:`014@xM+ v,-^KI5DUA{-6#@m*[>{ml& FbS7![Hv=txdz`tU4Jg52ZX |T'!m:Wd*j!C[6Oևrҩڍ\=Cfj&ZX |T'!m])Ѭ[<uCC^6txI3-Du!Κ ‘ ۪nhUPejsU0Y(f^>H'>\@CBeEkjLF@ZwƯOttDg3%SK*t UZ*3#Ja 0ӪR4Wkه?tw;ӝh7SrD奻pp #33QʱL,gR #bz,ߴpeУ% 0 fqK.of*Nӷf~; wU`ٽ|2:zъA5DuY^TR"CCw`A*x!TZ[]āENjG*/H$%*WP1ssmpfRkShHEZ<4K O%=f8^:hɇ@t:cI~6yZb%x߮ԫзo~OzZmjl^L`{mZ4Irp>d=62'v4k/w>i* ,.7gpM[I$ aV)? X1 8f?>p STW9Bk-mrBX+1xJC'O4GK: )J6&,i!q y}{gm]ۻS]Dzyȼ8P^Uc{Ou}\8'L:~\8OCA` `*0 C o@ ulhlĴ{m׉!p! 
Feb 17 14:06:12 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 14:06:12 crc restorecon[4703]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 14:06:12 crc restorecon[4703]:
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc 
restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 
crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:12 crc restorecon[4703]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:12 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 17 14:06:13 crc restorecon[4703]:
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 
14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc 
restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc 
restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc 
restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc 
restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc 
restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 14:06:13 crc restorecon[4703]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 14:06:14 crc kubenswrapper[4836]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:06:14 crc kubenswrapper[4836]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 14:06:14 crc kubenswrapper[4836]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:06:14 crc kubenswrapper[4836]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 14:06:14 crc kubenswrapper[4836]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 14:06:14 crc kubenswrapper[4836]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.187758 4836 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.191847 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.191968 4836 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192057 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192123 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192183 4836 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192252 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192369 4836 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192451 4836 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192515 4836 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 
14:06:14.192574 4836 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192643 4836 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192719 4836 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192786 4836 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192849 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192916 4836 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.192993 4836 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193069 4836 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193131 4836 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193196 4836 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193257 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193365 4836 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193430 4836 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193502 4836 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193581 4836 
feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193660 4836 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193755 4836 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193823 4836 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193904 4836 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.193984 4836 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194067 4836 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194142 4836 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194207 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194274 4836 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194384 4836 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194457 4836 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194525 4836 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194590 4836 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194673 4836 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194769 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194841 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194904 4836 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.194990 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195082 4836 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195145 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195204 4836 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195263 4836 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195402 4836 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195496 4836 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195574 4836 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195657 4836 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195730 4836 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195804 4836 feature_gate.go:330] 
unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195868 4836 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.195934 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196033 4836 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196107 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196170 4836 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196232 4836 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196339 4836 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196418 4836 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196503 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196600 4836 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196713 4836 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196802 4836 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196867 4836 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.196935 4836 feature_gate.go:330] unrecognized feature gate: 
NetworkDiagnosticsConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.197035 4836 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.197127 4836 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.197416 4836 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.197488 4836 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.197549 4836 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199326 4836 flags.go:64] FLAG: --address="0.0.0.0" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199360 4836 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199375 4836 flags.go:64] FLAG: --anonymous-auth="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199386 4836 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199396 4836 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199405 4836 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199416 4836 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199427 4836 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199435 4836 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199442 4836 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199451 4836 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199460 4836 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199468 4836 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199476 4836 flags.go:64] FLAG: --cgroup-root="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199483 4836 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199491 4836 flags.go:64] FLAG: --client-ca-file="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199498 4836 flags.go:64] FLAG: --cloud-config="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199506 4836 flags.go:64] FLAG: --cloud-provider="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199513 4836 flags.go:64] FLAG: --cluster-dns="[]" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199522 4836 flags.go:64] FLAG: --cluster-domain="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199529 4836 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199537 4836 flags.go:64] FLAG: --config-dir="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199544 4836 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199552 4836 flags.go:64] FLAG: --container-log-max-files="5" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199562 4836 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199569 4836 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199577 4836 
flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199585 4836 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199592 4836 flags.go:64] FLAG: --contention-profiling="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199600 4836 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199607 4836 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199624 4836 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199664 4836 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199674 4836 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199681 4836 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199689 4836 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199699 4836 flags.go:64] FLAG: --enable-load-reader="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199707 4836 flags.go:64] FLAG: --enable-server="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199714 4836 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199727 4836 flags.go:64] FLAG: --event-burst="100" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199735 4836 flags.go:64] FLAG: --event-qps="50" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199742 4836 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199750 4836 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 
14:06:14.199757 4836 flags.go:64] FLAG: --eviction-hard="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199767 4836 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199774 4836 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199781 4836 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199789 4836 flags.go:64] FLAG: --eviction-soft="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199797 4836 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199804 4836 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199812 4836 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199820 4836 flags.go:64] FLAG: --experimental-mounter-path="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199828 4836 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199836 4836 flags.go:64] FLAG: --fail-swap-on="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199843 4836 flags.go:64] FLAG: --feature-gates="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199852 4836 flags.go:64] FLAG: --file-check-frequency="20s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199860 4836 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199868 4836 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199876 4836 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199883 4836 flags.go:64] FLAG: --healthz-port="10248" Feb 17 14:06:14 crc kubenswrapper[4836]: 
I0217 14:06:14.199891 4836 flags.go:64] FLAG: --help="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199899 4836 flags.go:64] FLAG: --hostname-override="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199906 4836 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199914 4836 flags.go:64] FLAG: --http-check-frequency="20s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199921 4836 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199928 4836 flags.go:64] FLAG: --image-credential-provider-config="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199936 4836 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199944 4836 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199952 4836 flags.go:64] FLAG: --image-service-endpoint="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199959 4836 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199966 4836 flags.go:64] FLAG: --kube-api-burst="100" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199974 4836 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199984 4836 flags.go:64] FLAG: --kube-api-qps="50" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199991 4836 flags.go:64] FLAG: --kube-reserved="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.199999 4836 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200007 4836 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200014 4836 flags.go:64] FLAG: --kubelet-cgroups="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 
14:06:14.200022 4836 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200030 4836 flags.go:64] FLAG: --lock-file="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200037 4836 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200045 4836 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200053 4836 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200065 4836 flags.go:64] FLAG: --log-json-split-stream="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200074 4836 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200081 4836 flags.go:64] FLAG: --log-text-split-stream="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200089 4836 flags.go:64] FLAG: --logging-format="text" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200096 4836 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200104 4836 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200112 4836 flags.go:64] FLAG: --manifest-url="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200119 4836 flags.go:64] FLAG: --manifest-url-header="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200130 4836 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200138 4836 flags.go:64] FLAG: --max-open-files="1000000" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200147 4836 flags.go:64] FLAG: --max-pods="110" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200154 4836 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 
14:06:14.200162 4836 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200169 4836 flags.go:64] FLAG: --memory-manager-policy="None" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200177 4836 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200185 4836 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200194 4836 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200202 4836 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200225 4836 flags.go:64] FLAG: --node-status-max-images="50" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200233 4836 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200240 4836 flags.go:64] FLAG: --oom-score-adj="-999" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200250 4836 flags.go:64] FLAG: --pod-cidr="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200257 4836 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200287 4836 flags.go:64] FLAG: --pod-manifest-path="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200316 4836 flags.go:64] FLAG: --pod-max-pids="-1" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200326 4836 flags.go:64] FLAG: --pods-per-core="0" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200334 4836 flags.go:64] FLAG: --port="10250" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200342 4836 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200350 4836 flags.go:64] FLAG: --provider-id="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200357 4836 flags.go:64] FLAG: --qos-reserved="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200365 4836 flags.go:64] FLAG: --read-only-port="10255" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200373 4836 flags.go:64] FLAG: --register-node="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200380 4836 flags.go:64] FLAG: --register-schedulable="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200388 4836 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200403 4836 flags.go:64] FLAG: --registry-burst="10" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200410 4836 flags.go:64] FLAG: --registry-qps="5" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200417 4836 flags.go:64] FLAG: --reserved-cpus="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200424 4836 flags.go:64] FLAG: --reserved-memory="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200434 4836 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200442 4836 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200450 4836 flags.go:64] FLAG: --rotate-certificates="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200457 4836 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200464 4836 flags.go:64] FLAG: --runonce="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200472 4836 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200480 4836 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200487 4836 flags.go:64] FLAG: --seccomp-default="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200494 4836 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200502 4836 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200510 4836 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200518 4836 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200526 4836 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200533 4836 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200541 4836 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200549 4836 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200557 4836 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200565 4836 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200572 4836 flags.go:64] FLAG: --system-cgroups="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200579 4836 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200592 4836 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200600 4836 flags.go:64] FLAG: --tls-cert-file="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200607 4836 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 
14:06:14.200616 4836 flags.go:64] FLAG: --tls-min-version="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200625 4836 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200632 4836 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200640 4836 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200647 4836 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200656 4836 flags.go:64] FLAG: --v="2" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200666 4836 flags.go:64] FLAG: --version="false" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200675 4836 flags.go:64] FLAG: --vmodule="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200685 4836 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.200693 4836 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202779 4836 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202795 4836 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202803 4836 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202810 4836 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202818 4836 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202825 4836 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202832 4836 
feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202839 4836 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202845 4836 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202852 4836 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202859 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202865 4836 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202873 4836 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202881 4836 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202888 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202896 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202903 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202910 4836 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202918 4836 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202927 4836 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202934 4836 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202941 4836 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202948 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202955 4836 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202961 4836 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202968 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202974 4836 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202983 4836 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.202992 4836 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203000 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203007 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203016 4836 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203025 4836 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203033 4836 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203039 4836 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203047 4836 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203055 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203064 4836 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203071 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203077 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203084 4836 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203090 4836 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203097 4836 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203103 4836 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203109 4836 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203116 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203122 4836 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203129 4836 
feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203135 4836 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203141 4836 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203148 4836 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203155 4836 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203161 4836 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203167 4836 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203173 4836 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203179 4836 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203185 4836 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203192 4836 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203199 4836 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203205 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203211 4836 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203218 4836 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 
14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203224 4836 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203231 4836 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203238 4836 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203244 4836 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203251 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203257 4836 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203263 4836 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203269 4836 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.203275 4836 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.208433 4836 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.236811 4836 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.237671 4836 server.go:493] "Golang 
settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237934 4836 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237951 4836 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237960 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237967 4836 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237975 4836 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237982 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237988 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237993 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.237998 4836 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238003 4836 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238009 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238015 4836 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238020 4836 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 14:06:14 
crc kubenswrapper[4836]: W0217 14:06:14.238026 4836 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238032 4836 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238037 4836 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238042 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238047 4836 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238052 4836 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238057 4836 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238063 4836 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238068 4836 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238075 4836 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238081 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238088 4836 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238094 4836 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238100 4836 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238106 4836 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238111 4836 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238117 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238123 4836 feature_gate.go:330] unrecognized feature gate: Example Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238128 4836 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238134 4836 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238139 4836 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238145 4836 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238149 4836 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238154 4836 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238159 4836 
feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238164 4836 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238169 4836 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238173 4836 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238179 4836 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238183 4836 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238189 4836 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238194 4836 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238199 4836 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238204 4836 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238209 4836 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238214 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238218 4836 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238225 4836 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238232 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238237 4836 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238243 4836 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238249 4836 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238254 4836 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238259 4836 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238264 4836 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238270 4836 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238275 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238282 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238287 4836 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238314 4836 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238320 4836 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238325 4836 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238330 4836 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238336 4836 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238341 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238346 4836 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238351 4836 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238355 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.238365 4836 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238523 4836 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238532 4836 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238584 4836 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238590 4836 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238595 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238601 4836 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238605 4836 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238610 4836 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238615 4836 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238622 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238627 4836 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238632 4836 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238637 4836 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238641 4836 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238647 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238652 4836 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238660 4836 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238668 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238673 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238679 4836 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238685 4836 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238692 4836 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238698 4836 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238702 4836 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238707 4836 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238713 4836 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238718 4836 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238722 4836 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238727 4836 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238732 4836 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238738 4836 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238743 4836 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238747 4836 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238754 4836 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238760 4836 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238765 4836 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238772 4836 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238778 4836 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238782 4836 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238788 4836 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238794 4836 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238799 4836 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238804 4836 feature_gate.go:330] unrecognized feature gate: Example
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238810 4836 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238816 4836 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238822 4836 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238828 4836 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238834 4836 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238840 4836 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238847 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238853 4836 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238860 4836 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238868 4836 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238875 4836 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238882 4836 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238890 4836 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238897 4836 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238906 4836 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238913 4836 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238920 4836 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238926 4836 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238932 4836 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238938 4836 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238943 4836 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238948 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238953 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238959 4836 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238965 4836 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238970 4836 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238975 4836 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.238979 4836 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.238988 4836 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.240155 4836 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.249199 4836 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.249416 4836 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.251564 4836 server.go:997] "Starting client certificate rotation"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.251606 4836 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.251810 4836 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-12 02:47:06.199031913 +0000 UTC
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.251912 4836 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.286126 4836 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.289015 4836 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.290335 4836 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.307371 4836 log.go:25] "Validated CRI v1 runtime API"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.376949 4836 log.go:25] "Validated CRI v1 image API"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.379723 4836 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.387709 4836 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-14-00-55-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.387755 4836 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.423886 4836 manager.go:217] Machine: {Timestamp:2026-02-17 14:06:14.411030626 +0000 UTC m=+0.753958945 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f194f106-0bf2-4b65-bcb3-5215631b39d2 BootID:d638d470-b0e0-4be9-938f-7ec815bf6bd8 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:61:d6:c8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:61:d6:c8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:40:38:0b Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:56:ea:a0 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9c:ba:c2 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:91:cc:ac Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2a:3c:b3:3b:43:d3 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:c2:39:ac:01:6a:5e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.424197 4836 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.424443 4836 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.426036 4836 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.426275 4836 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.426347 4836 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.426602 4836 topology_manager.go:138] "Creating topology manager with none policy"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.426618 4836 container_manager_linux.go:303] "Creating device plugin manager"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.427114 4836 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.427167 4836 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.429192 4836 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.429302 4836 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.436324 4836 kubelet.go:418] "Attempting to sync node with API server"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.436355 4836 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.436393 4836 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.436409 4836 kubelet.go:324] "Adding apiserver pod source"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.436434 4836 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.444771 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.444901 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.450702 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.450776 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.453760 4836 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.455970 4836 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.460475 4836 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.462895 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.462934 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.462949 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.462962 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.462988 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.463005 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.463018 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.463037 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.463051 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.463063 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.463084 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.463096 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.464206 4836 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.464891 4836 server.go:1280] "Started kubelet"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.465256 4836 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.466243 4836 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.466704 4836 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.466889 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.466930 4836 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.466987 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 17:44:57.915309409 +0000 UTC
Feb 17 14:06:14 crc systemd[1]: Started Kubernetes Kubelet.
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.467503 4836 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.467602 4836 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.467611 4836 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.467744 4836 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.468330 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.468485 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.468535 4836 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.469605 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="200ms"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.472070 4836 factory.go:55] Registering systemd factory
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.472098 4836 factory.go:221] Registration of the systemd container factory successfully
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.472504 4836 factory.go:153] Registering CRI-O factory
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.472515 4836 factory.go:221] Registration of the crio container factory successfully
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.472728 4836 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.472773 4836 factory.go:103] Registering Raw factory
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.472794 4836 manager.go:1196] Started watching for new ooms in manager
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.473681 4836 manager.go:319] Starting recovery of all containers
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.497103 4836 server.go:460] "Adding debug handlers to kubelet server"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498575 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498657 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498672 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498685 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498705 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498716 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498727 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498852 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498869 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498882 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498895 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498909 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498920 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498935 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498948 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498962 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498976 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.498988 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499000 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499012 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499024 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd"
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499036 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499049 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499061 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499075 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499088 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499103 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499116 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499162 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499177 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499191 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499205 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499219 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499231 
4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499244 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499256 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499267 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499278 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499290 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499320 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499332 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499344 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499356 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499371 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499383 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499395 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499406 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499419 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499430 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499442 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499453 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499464 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499480 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499493 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499505 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499518 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499531 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499542 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499554 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499565 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499576 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499587 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499600 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499617 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" 
seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499627 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499638 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499650 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499662 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499673 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499685 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499697 4836 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499707 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499717 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499725 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499734 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499742 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499751 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499760 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499769 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499779 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499787 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499796 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499805 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499813 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499822 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499838 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499850 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499859 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499869 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499877 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499886 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499895 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499904 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499914 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499922 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" 
seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499936 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499945 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499953 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499963 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499972 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499982 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.499991 4836 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500000 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500010 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500024 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.498168 4836 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18950dc6756be7a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:06:14.464849831 +0000 UTC m=+0.807778130,LastTimestamp:2026-02-17 14:06:14.464849831 +0000 UTC m=+0.807778130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500033 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500124 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500172 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500196 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500216 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500238 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 17 
14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500258 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500278 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500326 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500349 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500369 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500388 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500407 4836 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.500426 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501065 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501097 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501112 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501127 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501146 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501160 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501175 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501190 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501206 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501220 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501235 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501248 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501262 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501277 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501343 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501365 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501380 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501396 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501410 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501425 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501440 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501458 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501496 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501516 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501534 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501555 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501570 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501585 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501599 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501613 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501627 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501642 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501657 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501671 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501689 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501704 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501719 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501734 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501748 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501761 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501775 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501794 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501811 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501825 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501873 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501888 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501902 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.501915 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.507258 4836 manager.go:324] Recovery completed
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.518821 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.520420 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.520458 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.520467 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.521307 4836 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.521323 4836 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.521339 4836 state_mem.go:36] "Initialized new in-memory state store"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.528942 4836 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529021 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529046 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529061 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529080 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529093 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529106 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660"
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529119 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529135 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529148 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529161 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529174 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529189 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529203 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529215 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529228 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529242 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529257 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529271 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529283 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529327 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529340 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529353 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529366 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529378 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529390 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529403 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529416 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529446 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529459 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529472 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" 
seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529484 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529509 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529521 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529534 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529547 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529560 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 14:06:14 crc 
kubenswrapper[4836]: I0217 14:06:14.529571 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529582 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529594 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529608 4836 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529619 4836 reconstruct.go:97] "Volume reconstruction finished"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.529628 4836 reconciler.go:26] "Reconciler: start to sync state"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.564156 4836 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.565790 4836 policy_none.go:49] "None policy: Start"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.566600 4836 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.566682 4836 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.566749 4836 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.566963 4836 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.566980 4836 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.566999 4836 state_mem.go:35] "Initializing new in-memory state store"
Feb 17 14:06:14 crc kubenswrapper[4836]: W0217 14:06:14.567321 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.567399 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.567752 4836 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.642885 4836 manager.go:334] "Starting Device Plugin manager"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.643096 4836 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.643129 4836 server.go:79] "Starting device plugin registration server"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.643657 4836 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.643685 4836 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.643956 4836 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.644082 4836 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.644105 4836 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.654232 4836 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.667924 4836 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.668131 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.669691 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.669746 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.669760 4836
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.669982 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.670571 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.670642 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="400ms" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.670676 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.675501 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.675562 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.675576 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.675794 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.675979 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.676035 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.676075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.676095 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.676155 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.676889 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.676909 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.676918 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.677111 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.677150 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.677164 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.677354 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.677538 
4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.677614 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.678069 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.678091 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.678100 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.678199 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.678358 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.678411 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679189 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679216 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679229 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679399 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679430 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679735 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679755 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679763 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679886 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679902 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.679910 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.680564 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.680612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.680633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.743842 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc 
kubenswrapper[4836]: I0217 14:06:14.744929 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.744962 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.744971 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.744995 4836 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.745510 4836 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834118 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834177 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834207 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834232 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834315 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834360 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834378 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834394 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834412 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834430 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834509 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834608 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834660 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 
14:06:14.834686 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.834716 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.935873 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936344 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936381 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936409 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936438 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936447 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936511 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936517 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936474 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 
14:06:14.936542 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936521 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936598 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936628 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936632 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936121 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936671 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936696 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936716 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936660 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936738 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936761 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936801 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936824 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936806 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936803 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936855 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936897 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936824 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.936954 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.937037 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.946511 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.948130 4836 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.948166 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.948177 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:14 crc kubenswrapper[4836]: I0217 14:06:14.948200 4836 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 14:06:14 crc kubenswrapper[4836]: E0217 14:06:14.948676 4836 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.000604 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.008965 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.025656 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.042767 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.047004 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: E0217 14:06:15.072240 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="800ms"
Feb 17 14:06:15 crc kubenswrapper[4836]: W0217 14:06:15.161204 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8542663a80ad2dce8cc30d6656370152510811bebc10a960a687ed9d28809db6 WatchSource:0}: Error finding container 8542663a80ad2dce8cc30d6656370152510811bebc10a960a687ed9d28809db6: Status 404 returned error can't find the container with id 8542663a80ad2dce8cc30d6656370152510811bebc10a960a687ed9d28809db6
Feb 17 14:06:15 crc kubenswrapper[4836]: W0217 14:06:15.161614 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-79b4471c38baad726843d2388b9c56eddcde98457fb9444a0a0ba4fd89f9eac4 WatchSource:0}: Error finding container 79b4471c38baad726843d2388b9c56eddcde98457fb9444a0a0ba4fd89f9eac4: Status 404 returned error can't find the container with id 79b4471c38baad726843d2388b9c56eddcde98457fb9444a0a0ba4fd89f9eac4
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.349402 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.351228 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.351280 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.351308 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.351349 4836 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: E0217 14:06:15.351834 4836 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.466548 4836 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.467580 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 07:06:36.231395621 +0000 UTC
Feb 17 14:06:15 crc kubenswrapper[4836]: W0217 14:06:15.492492 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:15 crc kubenswrapper[4836]: E0217 14:06:15.492572 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.571699 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"31942fdcee1d24800a616c3b72535d0bfedb200e7db6cf8b9a5cb69248777533"}
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.573306 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79b4471c38baad726843d2388b9c56eddcde98457fb9444a0a0ba4fd89f9eac4"}
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.578146 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac3aa5cc00f4173264fad5b484e64e792869df345c4a14cccf18b9917864c92e"}
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.580399 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8542663a80ad2dce8cc30d6656370152510811bebc10a960a687ed9d28809db6"}
Feb 17 14:06:15 crc kubenswrapper[4836]: I0217 14:06:15.581727 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e06cb61d981c06c98abcbe7e0f8de1d57343d3a093805b6133bbfae4618ce6b8"}
Feb 17 14:06:15 crc kubenswrapper[4836]: W0217 14:06:15.607269 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:15 crc kubenswrapper[4836]: E0217 14:06:15.607384 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:15 crc kubenswrapper[4836]: W0217 14:06:15.732653 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:15 crc kubenswrapper[4836]: E0217 14:06:15.732764 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:15 crc kubenswrapper[4836]: W0217 14:06:15.820473 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:15 crc kubenswrapper[4836]: E0217 14:06:15.820573 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:15 crc kubenswrapper[4836]: E0217 14:06:15.873423 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="1.6s"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.152700 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.154153 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.154195 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.154221 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.154251 4836 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 14:06:16 crc kubenswrapper[4836]: E0217 14:06:16.155082 4836 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.296585 4836 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 14:06:16 crc kubenswrapper[4836]: E0217 14:06:16.298160 4836 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.466555 4836 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.468740 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 09:48:01.364125556 +0000 UTC
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.587692 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905" exitCode=0
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.587845 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905"}
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.587876 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.589061 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.589093 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.589112 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.589717 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc"}
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.591124 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.591360 4836 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d" exitCode=0
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.591442 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d"}
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.591799 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.592077 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.592106 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.592114 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.592683 4836 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="735aeae35fb2663b5537053014ee78275b2abd919cbedb4730f40aca0a6921fd" exitCode=0
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.592741 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"735aeae35fb2663b5537053014ee78275b2abd919cbedb4730f40aca0a6921fd"}
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.592821 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.593929 4836 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1" exitCode=0
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.593952 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1"}
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.594057 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.594066 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.594095 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.594113 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.594826 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.594848 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.594861 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.596134 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.596196 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:16 crc kubenswrapper[4836]: I0217 14:06:16.596217 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:17 crc kubenswrapper[4836]: W0217 14:06:17.436109 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:17 crc kubenswrapper[4836]: E0217 14:06:17.436187 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.466034 4836 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.469222 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 11:03:57.224778909 +0000 UTC
Feb 17 14:06:17 crc kubenswrapper[4836]: E0217 14:06:17.557014 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="3.2s"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.611550 4836 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e" exitCode=0
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.611616 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.611722 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.612422 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.612453 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.612463 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.614796 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e246e50c75b51b522a87eb1e3c23d1a8a008b63a663fc03fa1e5b7feef6451c7"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.614861 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.615660 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.615677 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.615685 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.617876 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.617947 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.619649 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.619666 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.621650 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.621671 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.621679 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36"}
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.621974 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.622596 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.622612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.622620 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.755735 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.756828 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.756868 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.756879 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:17 crc kubenswrapper[4836]: I0217 14:06:17.756902 4836 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 17 14:06:17 crc kubenswrapper[4836]: E0217 14:06:17.757251 4836 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.233:6443: connect: connection refused" node="crc"
Feb 17 14:06:18 crc kubenswrapper[4836]: W0217 14:06:18.224111 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:18 crc kubenswrapper[4836]: E0217 14:06:18.224206 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:18 crc kubenswrapper[4836]: E0217 14:06:18.426730 4836 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18950dc6756be7a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:06:14.464849831 +0000 UTC m=+0.807778130,LastTimestamp:2026-02-17 14:06:14.464849831 +0000 UTC m=+0.807778130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 17 14:06:18 crc kubenswrapper[4836]: W0217 14:06:18.426962 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:18 crc kubenswrapper[4836]: E0217 14:06:18.427022 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.466156 4836 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.469321 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 13:15:48.82668008 +0000 UTC
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.626698 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08"}
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.626785 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.627561 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.627591 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.627604 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.629798 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1a52f71b0a88c92eb73fc0f119373f12587dd887e7dbb4a06dbb6e6d33d55c9"}
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.629929 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b"}
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.630020 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3"}
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.629873 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.631003 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.631111 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.631202 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.631508 4836 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2" exitCode=0
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.631579 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2"}
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.631666 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.631691 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632145 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632425 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632449 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632460 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632651 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632769 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632851 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.632789 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.633037 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:18 crc kubenswrapper[4836]: I0217 14:06:18.633049 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:18 crc kubenswrapper[4836]: W0217 14:06:18.867126 4836 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:18 crc kubenswrapper[4836]: E0217 14:06:18.867276 4836 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.233:6443: connect: connection refused" logger="UnhandledError"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:18.990648 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.335370 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.384017 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.467333 4836 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.233:6443: connect: connection refused
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.469837 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 12:16:26.711818581 +0000 UTC
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.571842 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.635480 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.637439 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c1a52f71b0a88c92eb73fc0f119373f12587dd887e7dbb4a06dbb6e6d33d55c9" exitCode=255
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.637492 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c1a52f71b0a88c92eb73fc0f119373f12587dd887e7dbb4a06dbb6e6d33d55c9"}
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.637556 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.638216 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.638251 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.638263 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.638805 4836 scope.go:117] "RemoveContainer" containerID="c1a52f71b0a88c92eb73fc0f119373f12587dd887e7dbb4a06dbb6e6d33d55c9"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.642025 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.642432 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16"}
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.642460 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07"}
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.642473 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995"}
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.642482 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8"}
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.642530 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.643070 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.643091 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.643098 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.643551 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.643567 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:19 crc kubenswrapper[4836]: I0217 14:06:19.643574 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.405108 4836 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.470086 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 09:15:12.274510797 +0000 UTC
Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.648414 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b"}
Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.648481 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.649552 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.649582 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.649594 4836 kubelet_node_status.go:724] "Recording event message for node"
node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.650146 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.652621 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177"} Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.652676 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.652821 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.653850 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.653916 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.653935 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.654097 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.654132 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.654143 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc 
kubenswrapper[4836]: I0217 14:06:20.862985 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.957786 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.958990 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.959029 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.959039 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:20 crc kubenswrapper[4836]: I0217 14:06:20.959063 4836 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.199081 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.470571 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 21:13:05.369418191 +0000 UTC Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.654378 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.654451 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.654624 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.655401 4836 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.655464 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.655481 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.655850 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.655921 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.655953 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.784193 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.784418 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.785684 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.785721 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.785733 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.990888 4836 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" start-of-body= Feb 17 14:06:21 crc kubenswrapper[4836]: I0217 14:06:21.990990 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.470689 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 12:28:00.217074616 +0000 UTC Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.660759 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.660923 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.661832 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.661868 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.661878 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.662375 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:22 crc kubenswrapper[4836]: I0217 14:06:22.662403 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:22 crc 
kubenswrapper[4836]: I0217 14:06:22.662414 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4836]: I0217 14:06:23.467230 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:23 crc kubenswrapper[4836]: I0217 14:06:23.467440 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:23 crc kubenswrapper[4836]: I0217 14:06:23.468686 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:23 crc kubenswrapper[4836]: I0217 14:06:23.468752 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:23 crc kubenswrapper[4836]: I0217 14:06:23.468771 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:23 crc kubenswrapper[4836]: I0217 14:06:23.471370 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 06:20:59.339888234 +0000 UTC Feb 17 14:06:24 crc kubenswrapper[4836]: I0217 14:06:24.472097 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 05:49:01.945698427 +0000 UTC Feb 17 14:06:24 crc kubenswrapper[4836]: E0217 14:06:24.654428 4836 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 14:06:24 crc kubenswrapper[4836]: I0217 14:06:24.830786 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 17 14:06:24 crc kubenswrapper[4836]: I0217 14:06:24.830963 4836 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 17 14:06:24 crc kubenswrapper[4836]: I0217 14:06:24.831984 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:24 crc kubenswrapper[4836]: I0217 14:06:24.832018 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:24 crc kubenswrapper[4836]: I0217 14:06:24.832027 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:25 crc kubenswrapper[4836]: I0217 14:06:25.472366 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:20:57.464223835 +0000 UTC Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.437857 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.438115 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.440363 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.440489 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.440582 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.446153 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.473442 4836 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 21:30:23.581972888 +0000 UTC Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.668895 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.670151 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.670227 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.670259 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:26 crc kubenswrapper[4836]: I0217 14:06:26.674167 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:27 crc kubenswrapper[4836]: I0217 14:06:27.473974 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 01:46:01.08478938 +0000 UTC Feb 17 14:06:27 crc kubenswrapper[4836]: I0217 14:06:27.670705 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:27 crc kubenswrapper[4836]: I0217 14:06:27.671785 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:27 crc kubenswrapper[4836]: I0217 14:06:27.671847 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:27 crc kubenswrapper[4836]: I0217 14:06:27.671864 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 
14:06:28 crc kubenswrapper[4836]: I0217 14:06:28.474323 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:13:39.456549446 +0000 UTC Feb 17 14:06:29 crc kubenswrapper[4836]: I0217 14:06:29.335908 4836 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 14:06:29 crc kubenswrapper[4836]: I0217 14:06:29.335962 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 14:06:29 crc kubenswrapper[4836]: I0217 14:06:29.474524 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:25:45.412269269 +0000 UTC Feb 17 14:06:29 crc kubenswrapper[4836]: I0217 14:06:29.572229 4836 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:06:29 crc kubenswrapper[4836]: I0217 14:06:29.572309 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Feb 17 14:06:30 crc kubenswrapper[4836]: I0217 14:06:30.263284 4836 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 14:06:30 crc kubenswrapper[4836]: I0217 14:06:30.263353 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 14:06:30 crc kubenswrapper[4836]: I0217 14:06:30.480083 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 23:51:45.167330691 +0000 UTC Feb 17 14:06:31 crc kubenswrapper[4836]: I0217 14:06:31.480592 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 01:36:08.755481393 +0000 UTC Feb 17 14:06:32 crc kubenswrapper[4836]: I0217 14:06:32.173923 4836 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" start-of-body= Feb 17 14:06:32 crc kubenswrapper[4836]: I0217 14:06:32.174066 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": 
context deadline exceeded" Feb 17 14:06:32 crc kubenswrapper[4836]: I0217 14:06:32.481776 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 04:25:58.199976609 +0000 UTC Feb 17 14:06:33 crc kubenswrapper[4836]: I0217 14:06:33.482452 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:11:16.49054462 +0000 UTC Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.482573 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 01:30:00.234956783 +0000 UTC Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.581430 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.581771 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.584631 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.584697 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.584708 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.586756 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:34 crc kubenswrapper[4836]: E0217 14:06:34.654596 4836 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" 
not found" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.693315 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.694160 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.694191 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.694201 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.886618 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.887337 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.888560 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.888614 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.888627 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:34 crc kubenswrapper[4836]: I0217 14:06:34.899529 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.268005 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" 
interval="6.4s" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.270424 4836 trace.go:236] Trace[1456549407]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:06:21.994) (total time: 13276ms): Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1456549407]: ---"Objects listed" error: 13276ms (14:06:35.270) Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1456549407]: [13.27613291s] [13.27613291s] END Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.270478 4836 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.270795 4836 trace.go:236] Trace[1113258187]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:06:23.149) (total time: 12120ms): Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1113258187]: ---"Objects listed" error: 12120ms (14:06:35.270) Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1113258187]: [12.120864197s] [12.120864197s] END Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.270826 4836 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.272131 4836 trace.go:236] Trace[1105507227]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 14:06:23.637) (total time: 11634ms): Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1105507227]: ---"Objects listed" error: 11634ms (14:06:35.272) Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1105507227]: [11.634940224s] [11.634940224s] END Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.272332 4836 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.272928 4836 trace.go:236] Trace[1684451451]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 
14:06:22.037) (total time: 13234ms): Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1684451451]: ---"Objects listed" error: 13234ms (14:06:35.272) Feb 17 14:06:35 crc kubenswrapper[4836]: Trace[1684451451]: [13.234925026s] [13.234925026s] END Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.272958 4836 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.278947 4836 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.279107 4836 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.289469 4836 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.320025 4836 csr.go:261] certificate signing request csr-fn48j is approved, waiting to be issued Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.327093 4836 csr.go:257] certificate signing request csr-fn48j is issued Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.471724 4836 apiserver.go:52] "Watching apiserver" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.475067 4836 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.475682 4836 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.476151 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.476571 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.477054 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.477054 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.477101 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.477309 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.477432 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.477501 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.477648 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.478208 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vt5sw"]
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.478549 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jlz6g"]
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.478680 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vt5sw"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.478817 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jlz6g"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.480481 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.480778 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.481285 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.482966 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 12:04:27.304922284 +0000 UTC
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.486924 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.487310 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.487551 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.487750 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.487772 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.494030 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.494076 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.494930 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.499190 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.499209 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.499459 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.506443 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.520662 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.567253 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.569190 4836 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.578988 4836 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.579619 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.579724 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.579803 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.579880 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.579945 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580012 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580082 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580149 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580234 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580330 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580411 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " 
Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580479 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580549 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580604 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580622 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580711 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580740 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580764 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580784 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580808 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580829 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580852 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580877 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580899 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580922 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580944 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580965 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.580989 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581009 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581024 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581039 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581057 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581071 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581085 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581105 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581128 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581156 
4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581177 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581200 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581220 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581241 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581263 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581286 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581339 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581365 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581387 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581409 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581429 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581460 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581480 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581509 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581530 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581551 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581572 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581604 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581624 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581646 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581670 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581691 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581711 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581732 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581753 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581787 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581810 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581832 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.581852 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582010 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582036 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582067 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582088 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582112 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582134 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582152 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582171 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582187 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582204 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582220 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582237 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582264 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582310 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" 
(UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582332 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582350 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582367 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582383 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582398 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582415 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582432 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582447 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582463 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582478 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582492 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" 
(UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582506 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582524 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582539 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582558 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582574 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582589 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582606 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582622 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582659 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582678 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582713 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 
14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582728 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582744 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582760 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582776 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582792 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582808 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582824 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582844 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582861 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582878 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582895 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582910 
4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582925 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582945 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582962 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582978 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.582996 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583013 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583029 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583044 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583059 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583058 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583074 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583089 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583106 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583125 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583142 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583158 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583173 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583190 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583205 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583220 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583236 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 
14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583254 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583274 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583311 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583329 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583345 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584983 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583228 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583250 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.593704 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583311 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583324 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583414 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.583481 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:06:36.083446558 +0000 UTC m=+22.426374827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583513 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583599 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583723 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583737 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583772 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583821 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583920 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583947 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.583972 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584040 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584115 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584183 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584205 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584248 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584413 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584417 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584642 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584714 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584664 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584806 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584837 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584882 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.584929 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.585027 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.585080 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.585126 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.585193 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.586395 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.586899 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.587148 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.587147 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.587264 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.587348 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.587494 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.587789 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588107 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588160 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588394 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588416 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588457 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588632 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588743 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588866 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.588873 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589071 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589133 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589322 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589341 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589404 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589424 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589537 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589547 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589722 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589777 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.589977 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.590326 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.590360 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.590382 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.590414 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.590521 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.590766 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.590930 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.591033 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.591580 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.591658 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.591900 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592030 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592272 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592288 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592613 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592673 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592755 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592875 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592882 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.592405 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.593377 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594186 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.593496 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.593501 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594147 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594246 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594278 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594326 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594350 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594432 4836 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594453 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594552 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594555 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594573 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594594 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594614 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594632 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594617 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594648 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594641 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594632 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594708 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594756 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594896 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595233 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595328 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595551 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595595 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595762 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595881 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595921 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595967 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.595965 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.594807 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596340 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596379 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596779 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596829 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596866 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596922 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596960 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596995 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.597046 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.597083 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: 
\"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596223 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596310 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596645 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596699 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.596731 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.597088 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.597648 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.598128 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.598162 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.598153 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.598426 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.598788 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599240 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599264 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599329 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.597118 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599391 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599400 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599411 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599636 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599660 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599671 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599430 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599750 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599701 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599870 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599899 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599913 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599918 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599948 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.599976 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600002 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600021 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 14:06:35 crc 
kubenswrapper[4836]: I0217 14:06:35.600041 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600059 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600100 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600144 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600148 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600162 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600182 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600202 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600218 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600234 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600250 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600272 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600310 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600329 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600344 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600362 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 
14:06:35.600377 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600393 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600409 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600412 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600427 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600502 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600544 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-serviceca\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600569 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600592 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600610 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-host\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600628 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbzg\" (UniqueName: \"kubernetes.io/projected/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-kube-api-access-8vbzg\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600653 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600676 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600700 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" 
(UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600724 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtsz\" (UniqueName: \"kubernetes.io/projected/f6d1f430-35ed-4c4e-a797-d7a0a5a45266-kube-api-access-kqtsz\") pod \"node-resolver-vt5sw\" (UID: \"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\") " pod="openshift-dns/node-resolver-vt5sw" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600746 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600769 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600775 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600790 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600819 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600840 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600869 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600895 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600919 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600937 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6d1f430-35ed-4c4e-a797-d7a0a5a45266-hosts-file\") pod \"node-resolver-vt5sw\" (UID: \"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\") " pod="openshift-dns/node-resolver-vt5sw" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.600938 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601049 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601065 4836 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601077 4836 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601091 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601103 4836 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601115 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601128 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc 
kubenswrapper[4836]: I0217 14:06:35.601142 4836 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601153 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601162 4836 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601173 4836 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601183 4836 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601194 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601208 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601221 4836 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601234 4836 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601248 4836 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601257 4836 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601266 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601275 4836 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601284 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601316 4836 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601321 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601329 4836 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601341 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601368 4836 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601388 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601404 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601416 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601428 4836 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601438 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601449 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601463 4836 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601477 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601490 4836 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601660 4836 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601677 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601677 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601838 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601884 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601897 4836 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601909 4836 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.601993 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602011 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc 
kubenswrapper[4836]: I0217 14:06:35.602025 4836 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602038 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602051 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602063 4836 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602081 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602094 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602106 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602119 
4836 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602132 4836 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602145 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602159 4836 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602172 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602184 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602211 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602224 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602241 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602251 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602264 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602276 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602287 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602319 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602331 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" 
DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602344 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602356 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602484 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602501 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602513 4836 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602526 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602537 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602550 4836 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602562 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602575 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602588 4836 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602604 4836 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602566 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602617 4836 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602719 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602734 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602746 4836 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602756 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602766 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602776 4836 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602786 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602797 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602807 4836 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602819 4836 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602832 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602846 4836 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602863 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602876 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602889 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602906 4836 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602918 4836 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" 
DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602931 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602945 4836 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602957 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602981 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602994 4836 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603006 4836 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603018 4836 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603029 4836 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603041 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603053 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603065 4836 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603078 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603090 4836 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603104 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603116 4836 reconciler_common.go:293] "Volume 
detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603131 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603145 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603160 4836 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603173 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603185 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603198 4836 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603210 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603223 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603236 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603249 4836 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603261 4836 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603272 4836 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603287 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603320 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603333 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603348 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603360 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603371 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602008 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602042 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602359 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602585 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602736 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.602961 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603390 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603051 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603123 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603469 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603184 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603621 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.603925 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.604147 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.604567 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.604632 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.604639 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.604908 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.605163 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.605231 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.605539 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.605818 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.606270 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.606455 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.606579 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:36.106562249 +0000 UTC m=+22.449490518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.606644 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.606683 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:36.106676812 +0000 UTC m=+22.449605081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.606686 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.607148 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.607223 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.607233 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.607641 4836 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.608664 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.608764 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.608989 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.609191 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.609181 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.609384 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.609741 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.610093 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.611389 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.611520 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.611989 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.612045 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.612368 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.615491 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.616250 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.617505 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.618176 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.619278 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.619405 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.619433 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.619790 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.619813 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.619848 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.619876 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.619939 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:36.119919263 +0000 UTC m=+22.462847532 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.620402 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.621680 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.621738 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.622404 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.622468 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.622492 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.622506 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:35 crc kubenswrapper[4836]: E0217 14:06:35.622561 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:36.122541443 +0000 UTC m=+22.465469882 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.622558 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.623132 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.623554 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.624490 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.624589 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.624836 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.626108 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.626881 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.627011 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.627056 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.627542 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.627796 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.629561 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.630968 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.631121 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.631626 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.635911 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.638329 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.639869 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.640511 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.649765 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.650122 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.652924 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.653824 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.659541 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.675548 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.692243 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703717 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtsz\" (UniqueName: \"kubernetes.io/projected/f6d1f430-35ed-4c4e-a797-d7a0a5a45266-kube-api-access-kqtsz\") pod \"node-resolver-vt5sw\" (UID: \"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\") " pod="openshift-dns/node-resolver-vt5sw" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703792 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703820 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703843 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6d1f430-35ed-4c4e-a797-d7a0a5a45266-hosts-file\") pod \"node-resolver-vt5sw\" (UID: \"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\") " pod="openshift-dns/node-resolver-vt5sw" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703868 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-serviceca\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703894 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-host\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703915 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbzg\" (UniqueName: \"kubernetes.io/projected/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-kube-api-access-8vbzg\") pod 
\"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703968 4836 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703983 4836 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.703995 4836 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704009 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704023 4836 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704036 4836 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704048 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704062 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704075 4836 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704085 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704096 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704111 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704124 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704120 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/f6d1f430-35ed-4c4e-a797-d7a0a5a45266-hosts-file\") pod \"node-resolver-vt5sw\" (UID: \"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\") " pod="openshift-dns/node-resolver-vt5sw" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704136 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704148 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704331 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704369 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704401 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-host\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704419 4836 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704434 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704446 4836 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704459 4836 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704470 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704482 4836 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704495 4836 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704507 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704519 4836 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704541 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704565 4836 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704577 4836 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704589 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704600 4836 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704613 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704625 4836 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704637 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704647 4836 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704659 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704671 4836 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704681 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704692 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704703 4836 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704713 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704725 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704737 4836 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704750 4836 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704762 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704774 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704786 4836 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704798 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704810 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704821 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704832 4836 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704844 4836 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704873 4836 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704885 4836 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 
crc kubenswrapper[4836]: I0217 14:06:35.704899 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704919 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704930 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704943 4836 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704954 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704964 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704975 4836 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704986 4836 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.704998 4836 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705008 4836 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705018 4836 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705028 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705038 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705050 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705061 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath 
\"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705072 4836 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.705107 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-serviceca\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.718698 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtsz\" (UniqueName: \"kubernetes.io/projected/f6d1f430-35ed-4c4e-a797-d7a0a5a45266-kube-api-access-kqtsz\") pod \"node-resolver-vt5sw\" (UID: \"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\") " pod="openshift-dns/node-resolver-vt5sw" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.719376 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbzg\" (UniqueName: \"kubernetes.io/projected/f87e91ef-e64c-45a5-9bd5-cc6537e51b1b-kube-api-access-8vbzg\") pod \"node-ca-jlz6g\" (UID: \"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\") " pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.721414 4836 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34420->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.721494 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:34420->192.168.126.11:17697: read: connection reset by peer" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.720406 4836 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41092->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.722664 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41092->192.168.126.11:17697: read: connection reset by peer" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.723017 4836 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.723129 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.795557 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.806616 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 14:06:35 crc kubenswrapper[4836]: W0217 14:06:35.817411 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-fddcb96714784863e96abf5ed55db043ac0bfb9c2084ffe566e853311f983486 WatchSource:0}: Error finding container fddcb96714784863e96abf5ed55db043ac0bfb9c2084ffe566e853311f983486: Status 404 returned error can't find the container with id fddcb96714784863e96abf5ed55db043ac0bfb9c2084ffe566e853311f983486 Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.821175 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.831056 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vt5sw" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.836731 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jlz6g" Feb 17 14:06:35 crc kubenswrapper[4836]: W0217 14:06:35.857709 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6d1f430_35ed_4c4e_a797_d7a0a5a45266.slice/crio-2944fbb7f287893828c9c9b126d5f545dfd917e4c6229460a9aedb029850836e WatchSource:0}: Error finding container 2944fbb7f287893828c9c9b126d5f545dfd917e4c6229460a9aedb029850836e: Status 404 returned error can't find the container with id 2944fbb7f287893828c9c9b126d5f545dfd917e4c6229460a9aedb029850836e Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.863099 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.931394 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bkk9g"] Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.932350 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.936608 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.936638 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.936613 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.936842 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.937013 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.945021 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.958837 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.985169 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:35 crc kubenswrapper[4836]: I0217 14:06:35.995077 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.005799 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.006680 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/895a19c9-a3f0-4a15-aa19-19347121388c-proxy-tls\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.006733 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/895a19c9-a3f0-4a15-aa19-19347121388c-rootfs\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.006775 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/895a19c9-a3f0-4a15-aa19-19347121388c-mcd-auth-proxy-config\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.006792 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99tf9\" (UniqueName: \"kubernetes.io/projected/895a19c9-a3f0-4a15-aa19-19347121388c-kube-api-access-99tf9\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.017623 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.025988 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.042514 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.057285 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.068732 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.107550 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:06:37.107524241 +0000 UTC m=+23.450452510 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.107881 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.107986 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.108013 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.108040 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/895a19c9-a3f0-4a15-aa19-19347121388c-proxy-tls\") pod 
\"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.108064 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/895a19c9-a3f0-4a15-aa19-19347121388c-rootfs\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.108087 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/895a19c9-a3f0-4a15-aa19-19347121388c-mcd-auth-proxy-config\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.108117 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99tf9\" (UniqueName: \"kubernetes.io/projected/895a19c9-a3f0-4a15-aa19-19347121388c-kube-api-access-99tf9\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.108577 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.108630 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 14:06:37.10861273 +0000 UTC m=+23.451541049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.108694 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.108797 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:37.108716302 +0000 UTC m=+23.451644571 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.109615 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/895a19c9-a3f0-4a15-aa19-19347121388c-rootfs\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.110613 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/895a19c9-a3f0-4a15-aa19-19347121388c-mcd-auth-proxy-config\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.116358 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/895a19c9-a3f0-4a15-aa19-19347121388c-proxy-tls\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.128442 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99tf9\" (UniqueName: \"kubernetes.io/projected/895a19c9-a3f0-4a15-aa19-19347121388c-kube-api-access-99tf9\") pod \"machine-config-daemon-bkk9g\" (UID: \"895a19c9-a3f0-4a15-aa19-19347121388c\") " pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc 
kubenswrapper[4836]: I0217 14:06:36.208997 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.209033 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219346 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219397 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219412 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219414 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219454 4836 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219471 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219490 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:37.219469974 +0000 UTC m=+23.562398243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.219532 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:37.219511675 +0000 UTC m=+23.562439944 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.257013 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.302171 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-c76cc"] Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.302551 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.308115 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.308401 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.308759 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.308889 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.309103 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.310012 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gfznp"] Feb 17 
14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.312154 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-t7845"] Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.312368 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.313169 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.322024 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.322450 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.322581 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.322689 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.322816 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.323047 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.323197 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.323197 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.328150 4836 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 14:01:35 +0000 UTC, rotation deadline is 2026-11-09 17:39:03.674999823 +0000 UTC Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.328205 4836 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6363h32m27.34679789s for next certificate rotation Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.331320 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.341865 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.366127 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: W0217 14:06:36.367632 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod895a19c9_a3f0_4a15_aa19_19347121388c.slice/crio-089eaed7b83958ef4cd6e49ca80a36b3b45719b2e4981b5ea960d68f4da80549 WatchSource:0}: Error finding container 089eaed7b83958ef4cd6e49ca80a36b3b45719b2e4981b5ea960d68f4da80549: Status 404 returned error can't find the container with id 089eaed7b83958ef4cd6e49ca80a36b3b45719b2e4981b5ea960d68f4da80549 Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.384573 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.408366 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410568 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-system-cni-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " 
pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410604 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-cni-multus\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410632 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8vh\" (UniqueName: \"kubernetes.io/projected/592aa549-1b1b-441e-93e4-0821e05ff2b2-kube-api-access-jc8vh\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410654 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-etc-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410687 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-kubelet\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410710 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-slash\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410732 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410756 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3eeaa6bd-bab3-4310-9522-747924f2e825-cni-binary-copy\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410775 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zdwb\" (UniqueName: \"kubernetes.io/projected/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-kube-api-access-7zdwb\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410793 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-log-socket\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410811 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-config\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410829 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-cnibin\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410851 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-os-release\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410877 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-etc-kubernetes\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410923 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grf7r\" (UniqueName: \"kubernetes.io/projected/3eeaa6bd-bab3-4310-9522-747924f2e825-kube-api-access-grf7r\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410944 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-cnibin\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410964 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-kubelet\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.410983 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-cni-bin\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411002 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-systemd-units\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411022 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-ovn-kubernetes\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411042 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-env-overrides\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411065 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-var-lib-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411086 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411106 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-node-log\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411145 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-multus-certs\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411167 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-script-lib\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411199 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-system-cni-dir\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411221 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/592aa549-1b1b-441e-93e4-0821e05ff2b2-cni-binary-copy\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411241 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-bin\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411262 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-netd\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411284 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-cni-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411331 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-ovn\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411353 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-socket-dir-parent\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411375 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-conf-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411406 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovn-node-metrics-cert\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411428 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3eeaa6bd-bab3-4310-9522-747924f2e825-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411446 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-hostroot\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411464 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-daemon-config\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411496 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-netns\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411516 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-netns\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411538 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411560 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-k8s-cni-cncf-io\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411584 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-systemd\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.411651 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-os-release\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.419023 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.442563 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.451903 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.462288 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.471313 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.481526 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.483523 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 11:26:56.999702702 +0000 UTC Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.512439 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-system-cni-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.512476 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-cni-multus\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.512739 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-system-cni-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.512827 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-cni-multus\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.512948 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-etc-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.512497 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-etc-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514006 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8vh\" (UniqueName: 
\"kubernetes.io/projected/592aa549-1b1b-441e-93e4-0821e05ff2b2-kube-api-access-jc8vh\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514039 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-kubelet\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514056 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-slash\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514072 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514090 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zdwb\" (UniqueName: \"kubernetes.io/projected/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-kube-api-access-7zdwb\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514106 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/3eeaa6bd-bab3-4310-9522-747924f2e825-cni-binary-copy\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514120 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-log-socket\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514057 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]
}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514134 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-config\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514165 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-cnibin\") pod \"multus-additional-cni-plugins-t7845\" (UID: 
\"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514183 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-os-release\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514196 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-etc-kubernetes\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514227 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grf7r\" (UniqueName: \"kubernetes.io/projected/3eeaa6bd-bab3-4310-9522-747924f2e825-kube-api-access-grf7r\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514241 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-cnibin\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514255 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-cni-bin\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc 
kubenswrapper[4836]: I0217 14:06:36.514269 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-kubelet\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514284 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-systemd-units\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514322 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-ovn-kubernetes\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514345 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-env-overrides\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514363 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-var-lib-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514380 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514395 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-node-log\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514414 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-multus-certs\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514431 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-script-lib\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514447 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-system-cni-dir\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514463 4836 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/592aa549-1b1b-441e-93e4-0821e05ff2b2-cni-binary-copy\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514479 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-ovn\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514494 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-bin\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514509 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-netd\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514524 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-cni-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514539 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-socket-dir-parent\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514554 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-conf-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514576 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovn-node-metrics-cert\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514590 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3eeaa6bd-bab3-4310-9522-747924f2e825-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514605 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-hostroot\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514619 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-daemon-config\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514633 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-netns\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514647 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-netns\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514663 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514678 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-k8s-cni-cncf-io\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514694 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-systemd\") pod 
\"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514710 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-os-release\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.514953 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-config\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515023 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-os-release\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515066 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-cnibin\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515100 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-os-release\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " 
pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515121 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-etc-kubernetes\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515170 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-kubelet\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515200 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-slash\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515223 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515456 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-cnibin\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515495 4836 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-cni-bin\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515528 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-var-lib-kubelet\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515558 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-systemd-units\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.515582 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-ovn-kubernetes\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516006 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-env-overrides\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516050 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-var-lib-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516071 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-openvswitch\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516092 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-node-log\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516114 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-multus-certs\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516567 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-script-lib\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516612 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-system-cni-dir\") pod 
\"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516760 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-netns\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.516838 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-hostroot\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517152 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3eeaa6bd-bab3-4310-9522-747924f2e825-cni-binary-copy\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517223 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-log-socket\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517193 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/592aa549-1b1b-441e-93e4-0821e05ff2b2-cni-binary-copy\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc 
kubenswrapper[4836]: I0217 14:06:36.517272 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-host-run-k8s-cni-cncf-io\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517277 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-systemd\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517315 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3eeaa6bd-bab3-4310-9522-747924f2e825-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517461 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-cni-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517489 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-ovn\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517504 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-socket-dir-parent\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517510 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-bin\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517529 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-netd\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517547 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-conf-dir\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517575 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-netns\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.517591 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/592aa549-1b1b-441e-93e4-0821e05ff2b2-multus-daemon-config\") pod 
\"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.518156 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3eeaa6bd-bab3-4310-9522-747924f2e825-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.520935 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovn-node-metrics-cert\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.545278 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.545892 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8vh\" (UniqueName: \"kubernetes.io/projected/592aa549-1b1b-441e-93e4-0821e05ff2b2-kube-api-access-jc8vh\") pod \"multus-c76cc\" (UID: \"592aa549-1b1b-441e-93e4-0821e05ff2b2\") " pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.552424 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grf7r\" (UniqueName: \"kubernetes.io/projected/3eeaa6bd-bab3-4310-9522-747924f2e825-kube-api-access-grf7r\") pod \"multus-additional-cni-plugins-t7845\" (UID: \"3eeaa6bd-bab3-4310-9522-747924f2e825\") " pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.560026 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.561339 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zdwb\" (UniqueName: \"kubernetes.io/projected/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-kube-api-access-7zdwb\") pod \"ovnkube-node-gfznp\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.573881 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.574575 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.576174 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.576904 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.578325 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.578948 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 
14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.579708 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.581046 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.581951 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.584758 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.585812 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.586756 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.588079 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.588805 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 
14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.589495 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.589676 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff82
53de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\
":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.591413 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.592265 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.593284 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.593947 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: 
I0217 14:06:36.594608 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.597809 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.598706 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.599223 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.600470 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.602313 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.603731 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.604503 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: 
I0217 14:06:36.605496 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.606036 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.606859 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.607341 4836 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.607445 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.609739 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.610521 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.610968 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.612648 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.613817 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.614436 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.615952 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.616679 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.617221 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.617889 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.618050 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-c76cc" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.619285 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.620335 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.620871 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.621921 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.622536 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.624169 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.624780 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.625733 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.626310 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.626924 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.629192 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.629982 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.627814 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.634945 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t7845" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.679675 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.688860 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.697917 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.703066 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.703120 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.703136 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"fddcb96714784863e96abf5ed55db043ac0bfb9c2084ffe566e853311f983486"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.703793 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.705617 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.709360 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.710953 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.711028 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177" exitCode=255 Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.711145 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 
14:06:36.711207 4836 scope.go:117] "RemoveContainer" containerID="c1a52f71b0a88c92eb73fc0f119373f12587dd887e7dbb4a06dbb6e6d33d55c9" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.715110 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"47d2c13bf3e4d71fa10e400c6e38b31f9db6c3c21c8413e9a420649f4d4cfa4d"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.730950 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.731256 4836 scope.go:117] "RemoveContainer" containerID="a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177" Feb 17 14:06:36 crc kubenswrapper[4836]: E0217 14:06:36.731553 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.731688 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.740484 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.740520 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"44528f3c87bd92c020dcd61eef6bdf96440546a99cb2ad727c4d12c7b41ccd2c"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.741871 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerStarted","Data":"dc193c55466e5525297bac82ed721d0f68341a9e983b8e349a3c89ceb9e53ab7"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.743074 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.743099 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"089eaed7b83958ef4cd6e49ca80a36b3b45719b2e4981b5ea960d68f4da80549"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.745553 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jlz6g" 
event={"ID":"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b","Type":"ContainerStarted","Data":"e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.745595 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jlz6g" event={"ID":"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b","Type":"ContainerStarted","Data":"6385b33f5ca92c383a9528f58e9bd8e4f9699b58d5246ef9d523a8df5756f25e"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.750808 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vt5sw" event={"ID":"f6d1f430-35ed-4c4e-a797-d7a0a5a45266","Type":"ContainerStarted","Data":"25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.750844 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vt5sw" event={"ID":"f6d1f430-35ed-4c4e-a797-d7a0a5a45266","Type":"ContainerStarted","Data":"2944fbb7f287893828c9c9b126d5f545dfd917e4c6229460a9aedb029850836e"} Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.761194 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.773064 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.790762 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: W0217 14:06:36.800028 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67e8cda7_ec53_43bd_9fec_8ac4d6ecc26e.slice/crio-3bdc7f19fb50c4c29fda01e2e231206d0048a98eab720a4ee93274d360c514d1 WatchSource:0}: Error finding container 3bdc7f19fb50c4c29fda01e2e231206d0048a98eab720a4ee93274d360c514d1: Status 404 returned error can't find the container with id 3bdc7f19fb50c4c29fda01e2e231206d0048a98eab720a4ee93274d360c514d1 Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.806670 4836 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.818167 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.829395 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.842565 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s
2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.853878 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.871659 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.885413 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.902053 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:0
6:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.915053 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a52f71b0a88c92eb73fc0f119373f12587dd887e7dbb4a06dbb6e6d33d55c9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:19Z\\\",\\\"message\\\":\\\"W0217 14:06:18.530858 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 14:06:18.531266 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771337178 cert, and key in /tmp/serving-cert-770380950/serving-signer.crt, 
/tmp/serving-cert-770380950/serving-signer.key\\\\nI0217 14:06:18.912509 1 observer_polling.go:159] Starting file observer\\\\nW0217 14:06:18.916420 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 14:06:18.918412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:18.920068 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-770380950/tls.crt::/tmp/serving-cert-770380950/tls.key\\\\\\\"\\\\nF0217 14:06:19.125256 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.930038 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.939645 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.948160 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.954402 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.972750 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:36 crc kubenswrapper[4836]: I0217 14:06:36.987403 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.119398 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.119562 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:06:39.11953413 +0000 UTC m=+25.462462399 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.119911 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.120026 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.120157 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.120209 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.120372 4836 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:39.120338101 +0000 UTC m=+25.463266560 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.120401 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:39.120390802 +0000 UTC m=+25.463319291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.221166 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.221217 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221405 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221426 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221439 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221497 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:39.221481919 +0000 UTC m=+25.564410188 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221552 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221588 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221600 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.221653 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:39.221636953 +0000 UTC m=+25.564565222 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.483882 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 06:54:47.830613584 +0000 UTC Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.567173 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.567451 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.567188 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.567568 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.567777 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.567890 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.755104 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerStarted","Data":"d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc"} Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.757100 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.760036 4836 scope.go:117] "RemoveContainer" containerID="a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177" Feb 17 14:06:37 crc kubenswrapper[4836]: E0217 14:06:37.760373 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.760602 4836 generic.go:334] "Generic (PLEG): container finished" podID="3eeaa6bd-bab3-4310-9522-747924f2e825" containerID="b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55" exitCode=0 Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.760687 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerDied","Data":"b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55"} Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.760740 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerStarted","Data":"d19db0389ac58eebb9dc83dd13a3b70c233c32490fb0080720176b6469910e22"} Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.762402 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9" exitCode=0 Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.762489 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.762524 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" 
event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"3bdc7f19fb50c4c29fda01e2e231206d0048a98eab720a4ee93274d360c514d1"} Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.768016 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b"} Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.774627 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0
dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192
.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.788011 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42
745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.802701 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.818084 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.830478 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.844692 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.884580 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.925145 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c1a52f71b0a88c92eb73fc0f119373f12587dd887e7dbb4a06dbb6e6d33d55c9\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:19Z\\\",\\\"message\\\":\\\"W0217 14:06:18.530858 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0217 14:06:18.531266 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771337178 cert, and key in /tmp/serving-cert-770380950/serving-signer.crt, /tmp/serving-cert-770380950/serving-signer.key\\\\nI0217 14:06:18.912509 1 observer_polling.go:159] Starting file observer\\\\nW0217 14:06:18.916420 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0217 14:06:18.918412 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:18.920068 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-770380950/tls.crt::/tmp/serving-cert-770380950/tls.key\\\\\\\"\\\\nF0217 14:06:19.125256 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.950500 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.965032 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.977501 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:37 crc kubenswrapper[4836]: I0217 14:06:37.989052 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:37Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.011353 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.031974 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.044180 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.062958 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.081509 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.096329 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.113004 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.128953 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.143335 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.178130 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.191635 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.204534 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.216449 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.251374 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.266946 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.282386 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.491869 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2025-11-22 09:30:30.029718158 +0000 UTC Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.773816 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerStarted","Data":"14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4"} Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.777453 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.777520 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.777534 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.777544 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.793167 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.810157 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.821948 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.844332 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.857814 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.895850 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.910843 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.923410 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.938283 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.952601 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.977414 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.989185 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:38Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:38 crc kubenswrapper[4836]: I0217 14:06:38.995013 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.000051 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.005719 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.005879 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers 
with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 
14:06:39.023246 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 
14:06:39.035975 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.060341 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.070285 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.090208 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.102899 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.118277 4836 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.128030 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.136233 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.142290 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.142410 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:06:43.142390678 +0000 UTC m=+29.485318947 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.142505 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.142563 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.142583 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.142651 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:43.142633005 +0000 UTC m=+29.485561344 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.142667 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.142698 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:43.142692046 +0000 UTC m=+29.485620315 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.146567 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.155113 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.172223 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.187442 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.204941 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.220566 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.237852 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.243101 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.243145 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243277 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243312 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243322 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243369 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:43.243356081 +0000 UTC m=+29.586284350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243383 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243414 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243427 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.243481 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:43.243465684 +0000 UTC m=+29.586393953 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.492677 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:37:22.158181485 +0000 UTC Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.567552 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.567731 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.568142 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.568257 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.568364 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.568441 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.786979 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.787035 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.789089 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d"} Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.791421 4836 generic.go:334] "Generic (PLEG): container finished" podID="3eeaa6bd-bab3-4310-9522-747924f2e825" 
containerID="14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4" exitCode=0 Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.792184 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerDied","Data":"14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4"} Feb 17 14:06:39 crc kubenswrapper[4836]: E0217 14:06:39.798721 4836 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.810483 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.830416 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.845534 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.861278 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.877788 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.892055 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.906087 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.921430 4836 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.933857 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.946173 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.955914 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.977128 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:39 crc kubenswrapper[4836]: I0217 14:06:39.994003 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:39Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.016979 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.027664 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.040037 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.052509 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.071923 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.090076 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.103453 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.122632 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.137470 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.151031 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.162792 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.173973 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.189978 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.200534 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.221864 4836 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.261443 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.302571 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.493090 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 10:00:13.625338251 +0000 UTC Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.803674 4836 generic.go:334] "Generic (PLEG): container finished" podID="3eeaa6bd-bab3-4310-9522-747924f2e825" containerID="dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c" exitCode=0 Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.803783 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerDied","Data":"dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c"} Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.821472 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.834023 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.845770 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.856689 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.880994 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c0659834172
9c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.894600 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.907105 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.918492 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.932367 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.944479 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.967740 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.985855 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1
c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:40 crc kubenswrapper[4836]: I0217 14:06:40.998403 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.010775 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-opera
tor@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.022383 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.047386 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.048134 4836 scope.go:117] "RemoveContainer" containerID="a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177" Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.048363 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.494124 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:02:55.742413648 +0000 UTC Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.567530 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.567584 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.567662 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.567774 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.567905 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.568053 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.679661 4836 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.682537 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.682588 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.682605 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.682732 4836 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.694574 4836 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.694937 4836 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.696677 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.696725 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.696745 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.696769 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.696787 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.714419 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.718035 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.718068 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.718080 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.718095 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.718107 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.729387 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.732929 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.732958 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.732965 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.732979 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.732988 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.744433 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.748130 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.748160 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.748170 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.748186 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.748198 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.759280 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.762241 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.762277 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.762305 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.762323 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.762335 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.773959 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: E0217 14:06:41.774138 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.775730 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.775765 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.775779 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.775798 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.775812 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.809968 4836 generic.go:334] "Generic (PLEG): container finished" podID="3eeaa6bd-bab3-4310-9522-747924f2e825" containerID="b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7" exitCode=0 Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.810095 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerDied","Data":"b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7"} Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.815957 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.826057 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.839726 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.850757 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.871523 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.878979 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.879081 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.879143 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.879168 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.879227 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.885679 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.900093 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.912946 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.925484 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.935653 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b3
0211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.956759 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.968889 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.981491 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.982033 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.982064 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.982075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.982091 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.982103 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:41Z","lastTransitionTime":"2026-02-17T14:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:41 crc kubenswrapper[4836]: I0217 14:06:41.992609 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:41Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.003600 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.017369 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.085725 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.086033 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.086046 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.086063 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.086078 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.188088 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.188166 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.188176 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.188192 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.188202 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.290526 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.290598 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.290620 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.290650 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.290672 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.393694 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.393740 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.393752 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.393769 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.393781 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.494397 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:19:37.1478885 +0000 UTC Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.496085 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.496279 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.496612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.496740 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.496854 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.599676 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.599718 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.599727 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.599743 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.599752 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.702451 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.702490 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.702498 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.702511 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.702521 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.805606 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.805649 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.805659 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.805677 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.805690 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.820223 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerStarted","Data":"2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.833707 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.844816 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c18
0dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.855703 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.866157 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.877044 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.895008 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c0659834172
9c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.907667 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.908555 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.908587 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.908599 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.908615 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:42 crc 
kubenswrapper[4836]: I0217 14:06:42.908627 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:42Z","lastTransitionTime":"2026-02-17T14:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.918912 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.929719 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.941420 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.950428 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.967872 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.981037 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1
c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:42 crc kubenswrapper[4836]: I0217 14:06:42.993331 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:42Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.003819 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.010411 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.010444 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.010456 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.010471 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.010483 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.112988 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.113029 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.113038 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.113052 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.113061 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.183716 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.183857 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.183912 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:06:51.183876672 +0000 UTC m=+37.526804981 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.184012 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.184046 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.184075 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:51.184058017 +0000 UTC m=+37.526986316 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.184219 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.184327 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:51.184307624 +0000 UTC m=+37.527235883 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.215821 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.215872 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.215888 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.215914 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.215931 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.285128 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.285199 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285428 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285433 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285455 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285478 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285484 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285497 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285567 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:51.285545654 +0000 UTC m=+37.628473953 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.285593 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:06:51.285581985 +0000 UTC m=+37.628510284 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.319325 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.319387 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.319406 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.319430 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.319446 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.426583 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.426846 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.426920 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.426999 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.427070 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.495324 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 19:32:36.286672043 +0000 UTC
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.533149 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.533211 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.533247 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.533264 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.533276 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.567981 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.568116 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.568528 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.568586 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.568622 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:43 crc kubenswrapper[4836]: E0217 14:06:43.568658 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.636141 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.636188 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.636205 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.636225 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.636238 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.739962 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.740028 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.740040 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.740061 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.740075 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.826354 4836 generic.go:334] "Generic (PLEG): container finished" podID="3eeaa6bd-bab3-4310-9522-747924f2e825" containerID="2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0" exitCode=0
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.826404 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerDied","Data":"2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.832486 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5"}
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.832809 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.843059 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.843095 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.843105 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.843120 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.843131 4836 setters.go:603] "Node became not ready" node="crc"
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.848157 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.864696 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.874133 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.878315 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.893110 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.905602 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.929128 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.943862 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1
c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.945600 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.945641 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.945649 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.945665 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.945674 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:43Z","lastTransitionTime":"2026-02-17T14:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.959847 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.973761 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:43 crc kubenswrapper[4836]: I0217 14:06:43.984872 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.002391 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.014502 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.024740 4836 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.035960 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.043718 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.048889 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.048925 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.048936 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.048951 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.048966 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.054022 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.070511 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.095597 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.111268 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.126625 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.142505 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.150947 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.150979 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.150991 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.151008 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.151020 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.164625 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.177410 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.192478 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.215875 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.237327 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.251927 4836 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.253974 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e
01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-operator/pods/network-operator-58b4c7f79c-55gtf/status\": read tcp 38.102.83.233:43382->38.102.83.233:6443: use of closed network connection" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.275742 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.275778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.275786 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.275799 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.275807 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.294336 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.312201 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.323222 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.377774 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc 
kubenswrapper[4836]: I0217 14:06:44.377827 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.377847 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.377872 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.377890 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.480353 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.480388 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.480396 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.480410 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.480422 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.496107 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:46:24.682049718 +0000 UTC Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.583420 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.583469 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.583480 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.583499 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.583511 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.601559 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.644784 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.670960 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.685826 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.685874 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.685886 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.685904 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.685916 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.688390 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.699537 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.721031 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.735738 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.751324 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.763264 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.774598 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.789256 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.789325 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.789340 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.789360 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.789374 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.794714 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.810798 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.824675 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.840869 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z"
Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.849249 4836 generic.go:334] "Generic (PLEG): container finished" podID="3eeaa6bd-bab3-4310-9522-747924f2e825" 
containerID="8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b" exitCode=0
Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.849319 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerDied","Data":"8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b"}
Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.849626 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp"
Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.849644 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp"
Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.857529 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b
ce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z"
Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.876931 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"na
me\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8
54199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900922
72e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.925100 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.925155 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.925170 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.925191 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 
14:06:44.925208 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:44Z","lastTransitionTime":"2026-02-17T14:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.929669 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.929807 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.951930 
4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.970455 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:44 crc kubenswrapper[4836]: I0217 14:06:44.982168 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.005236 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-m
etrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-
controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.021783 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.028982 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.029031 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.029043 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.029060 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.029072 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.035168 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.053370 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.068367 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.079214 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.096821 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.123053 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.131048 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.131078 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.131087 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.131105 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.131139 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.137022 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.150147 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.165958 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.180041 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.192152 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.212269 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c0659834172
9c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.233873 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.235143 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.235211 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.235223 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.235246 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.235262 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.249793 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.263380 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.280667 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.294098 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.319404 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.335790 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.337335 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.337381 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.337391 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.337405 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.337413 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.353094 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.373717 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.387742 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.403058 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.440715 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.440774 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.440787 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.440812 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.440829 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.496756 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 16:21:12.641158485 +0000 UTC Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.544429 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.544491 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.544504 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.544526 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.544539 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.567797 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.567870 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:45 crc kubenswrapper[4836]: E0217 14:06:45.567935 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.567951 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:45 crc kubenswrapper[4836]: E0217 14:06:45.568103 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:45 crc kubenswrapper[4836]: E0217 14:06:45.568180 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.647208 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.647256 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.647268 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.647284 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.647309 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.750386 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.750471 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.750482 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.750496 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.750506 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.852485 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.852527 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.852541 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.852560 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.852575 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.857963 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" event={"ID":"3eeaa6bd-bab3-4310-9522-747924f2e825","Type":"ContainerStarted","Data":"54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.881202 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:0
6:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.892280 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.903791 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.916178 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.928415 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.948843 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.954801 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.954856 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.954866 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.954882 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.954892 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:45Z","lastTransitionTime":"2026-02-17T14:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.966888 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.982557 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:45 crc kubenswrapper[4836]: I0217 14:06:45.994412 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.013517 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.026115 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.042531 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf79
8778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d
34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.054823 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.056758 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.056793 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.056805 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.056822 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.056835 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.068131 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.078432 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.159809 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 
14:06:46.159854 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.159864 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.159880 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.159893 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.262402 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.262476 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.262494 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.262521 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.262540 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.365424 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.365471 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.365484 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.365503 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.365515 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.468500 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.468561 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.468576 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.468600 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.468615 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.497878 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:50:39.598494518 +0000 UTC Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.571388 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.571455 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.571480 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.571500 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.571513 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.674201 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.674240 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.674249 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.674262 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.674274 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.777564 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.778081 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.778179 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.778288 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.778390 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.961966 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.962007 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.962015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.962031 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:46 crc kubenswrapper[4836]: I0217 14:06:46.962040 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:46Z","lastTransitionTime":"2026-02-17T14:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.066415 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.066455 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.066468 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.066481 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.066490 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.168495 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.168560 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.168582 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.168607 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.168625 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.270963 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.271004 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.271013 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.271028 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.271036 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.373542 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.373589 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.373605 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.373627 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.373643 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.476801 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.477102 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.477208 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.477367 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.477444 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.499590 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 07:16:53.359093644 +0000 UTC Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.567114 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:47 crc kubenswrapper[4836]: E0217 14:06:47.567518 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.567176 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:47 crc kubenswrapper[4836]: E0217 14:06:47.568079 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.567132 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:47 crc kubenswrapper[4836]: E0217 14:06:47.568374 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.579091 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.579386 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.579497 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.579572 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.579684 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.682144 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.682439 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.682708 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.682793 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.682867 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.785751 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.785797 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.785808 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.785832 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.785844 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.889474 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.889549 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.889566 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.889601 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.889617 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.992273 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.992321 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.992332 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.992348 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:47 crc kubenswrapper[4836]: I0217 14:06:47.992359 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:47Z","lastTransitionTime":"2026-02-17T14:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.095575 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.095644 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.095658 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.095681 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.095699 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.199049 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.199099 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.199113 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.199131 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.199145 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.301856 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.301915 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.301929 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.301948 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.301961 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.405045 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.405467 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.405619 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.405766 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.405902 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.470777 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8"] Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.471330 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: W0217 14:06:48.473233 4836 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert": failed to list *v1.Secret: secrets "ovn-control-plane-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 17 14:06:48 crc kubenswrapper[4836]: E0217 14:06:48.473273 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-control-plane-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 14:06:48 crc kubenswrapper[4836]: W0217 14:06:48.473356 4836 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd": failed to list *v1.Secret: secrets "ovn-kubernetes-control-plane-dockercfg-gs7dd" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 17 14:06:48 crc kubenswrapper[4836]: E0217 14:06:48.473371 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-gs7dd\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-control-plane-dockercfg-gs7dd\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 14:06:48 crc 
kubenswrapper[4836]: I0217 14:06:48.480194 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.480271 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6dh2\" (UniqueName: \"kubernetes.io/projected/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-kube-api-access-j6dh2\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.480324 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.480458 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.484137 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a
42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.499936 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 01:56:51.906011823 +0000 UTC Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.500011 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.508314 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.508373 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.508389 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.508410 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.508425 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.514018 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.526539 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.551382 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.576715 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.581543 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.581624 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.581683 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6dh2\" (UniqueName: \"kubernetes.io/projected/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-kube-api-access-j6dh2\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.581720 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.582457 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: 
\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.582497 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.597244 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.607245 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6dh2\" (UniqueName: \"kubernetes.io/projected/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-kube-api-access-j6dh2\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.612545 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.612596 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.612612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.612636 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc 
kubenswrapper[4836]: I0217 14:06:48.612651 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.614927 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.632178 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.647435 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.677232 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.695129 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.715267 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.715329 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.715339 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.715359 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.715370 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.715830 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.729116 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.744769 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.759555 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.817420 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc 
kubenswrapper[4836]: I0217 14:06:48.817458 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.817467 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.817481 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.817490 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.869273 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/0.log" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.871586 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5" exitCode=1 Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.871622 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.872508 4836 scope.go:117] "RemoveContainer" containerID="efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5" Feb 17 14:06:48 crc 
kubenswrapper[4836]: I0217 14:06:48.888505 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.900564 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.911305 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.924528 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.924580 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.924590 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.924608 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.924620 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:48Z","lastTransitionTime":"2026-02-17T14:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.934025 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.947102 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.959866 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.973156 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.986277 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:48 crc kubenswrapper[4836]: I0217 14:06:48.997759 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b3
0211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:48Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.022667 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"flector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167820 6099 
reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167887 6099 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.168346 6099 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:06:48.168371 6099 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:06:48.168384 6099 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:06:48.168403 6099 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:06:48.168427 6099 factory.go:656] Stopping watch factory\\\\nI0217 14:06:48.168443 6099 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:06:48.168450 6099 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:06:48.168457 6099 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:06:48.168463 6099 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.027026 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.027072 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.027085 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.027102 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.027116 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.039321 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.055981 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.070217 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.084615 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.099862 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.111449 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.129352 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.129390 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.129399 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.129412 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.129420 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.231531 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.231810 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.231901 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.231994 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.232112 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.289698 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.301162 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7nmc8\" (UID: \"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.335124 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.335185 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.335203 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.335226 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.335245 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.438535 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.438615 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.438627 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.438653 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.438669 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.500380 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:12:28.745568249 +0000 UTC Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.541160 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.541212 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.541224 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.541247 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.541259 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.568007 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.568035 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.568086 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:49 crc kubenswrapper[4836]: E0217 14:06:49.568177 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:49 crc kubenswrapper[4836]: E0217 14:06:49.568342 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:49 crc kubenswrapper[4836]: E0217 14:06:49.568518 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.643654 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.643689 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.643698 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.643714 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.643724 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.746262 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.746331 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.746343 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.746357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.746367 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.849140 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.849196 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.849207 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.849225 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.849238 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.877840 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/1.log" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.878535 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/0.log" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.881557 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f" exitCode=1 Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.881624 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.881682 4836 scope.go:117] "RemoveContainer" containerID="efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.882827 4836 scope.go:117] "RemoveContainer" containerID="b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f" Feb 17 14:06:49 crc kubenswrapper[4836]: E0217 14:06:49.883089 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.901783 4836 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.919210 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.934903 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.944787 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-c4txt"] Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.945352 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:49 crc kubenswrapper[4836]: E0217 14:06:49.945425 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.951961 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.952339 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.952399 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.952411 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.952432 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.952444 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:49Z","lastTransitionTime":"2026-02-17T14:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.967939 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.972211 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.982052 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"conta
inerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.998532 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:49Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.998946 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78bt\" (UniqueName: \"kubernetes.io/projected/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-kube-api-access-g78bt\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:49 crc kubenswrapper[4836]: I0217 14:06:49.999230 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.013254 4836 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.026642 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.041903 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.056647 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.056778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.056836 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc 
kubenswrapper[4836]: I0217 14:06:50.056849 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.056871 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.056886 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.067542 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.087414 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"flector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167820 6099 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167887 6099 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.168346 6099 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:06:48.168371 6099 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:06:48.168384 6099 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:06:48.168403 6099 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:06:48.168427 6099 factory.go:656] Stopping watch factory\\\\nI0217 14:06:48.168443 6099 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:06:48.168450 6099 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:06:48.168457 6099 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:06:48.168463 6099 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 
14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\
\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.100765 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g78bt\" (UniqueName: \"kubernetes.io/projected/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-kube-api-access-g78bt\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.100895 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:50 crc kubenswrapper[4836]: E0217 14:06:50.101074 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:50 crc kubenswrapper[4836]: E0217 14:06:50.101230 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:06:50.601200393 +0000 UTC m=+36.944128662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.102980 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.118025 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.120125 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78bt\" (UniqueName: \"kubernetes.io/projected/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-kube-api-access-g78bt\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.135684 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.146351 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.158200 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 
14:06:50.159668 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.159697 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.159705 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.159719 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.159730 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.175896 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.189004 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.201415 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.213852 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.225964 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.238414 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.250064 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.263076 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.263132 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.263142 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.263155 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.263163 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.267801 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"flector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167820 6099 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167887 6099 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.168346 6099 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:06:48.168371 6099 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:06:48.168384 6099 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:06:48.168403 6099 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:06:48.168427 6099 factory.go:656] Stopping watch factory\\\\nI0217 14:06:48.168443 6099 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:06:48.168450 6099 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:06:48.168457 6099 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:06:48.168463 6099 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 
14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\
\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.279516 4836 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f
2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.292907 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.304135 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.316846 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.328027 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.337994 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc 
kubenswrapper[4836]: I0217 14:06:50.348816 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.362719 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.365396 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc 
kubenswrapper[4836]: I0217 14:06:50.365467 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.365490 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.365523 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.365547 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.468018 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.468084 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.468098 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.468110 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.468121 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.500656 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 05:26:52.995139649 +0000 UTC Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.574112 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.574176 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.574194 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.574213 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.574229 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.606225 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:50 crc kubenswrapper[4836]: E0217 14:06:50.606524 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:50 crc kubenswrapper[4836]: E0217 14:06:50.606727 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:06:51.606691804 +0000 UTC m=+37.949620223 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.677056 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.677097 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.677107 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.677123 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.677134 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.780358 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.780403 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.780420 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.780436 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.780448 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.882957 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.882994 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.883005 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.883021 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.883032 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.888663 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/1.log" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.894315 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" event={"ID":"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a","Type":"ContainerStarted","Data":"61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.894366 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" event={"ID":"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a","Type":"ContainerStarted","Data":"ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.894382 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" event={"ID":"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a","Type":"ContainerStarted","Data":"9008936532e387fa5da7596a6b296d2fa252df36d3f613f719407f505026a1e4"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.912287 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.927974 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.938894 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.949505 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.962359 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.977055 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.986257 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.986315 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.986327 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.986370 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.986385 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:50Z","lastTransitionTime":"2026-02-17T14:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:50 crc kubenswrapper[4836]: I0217 14:06:50.987880 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:50Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.006704 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.028476 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.049379 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.065986 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.085190 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.089410 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.089436 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc 
kubenswrapper[4836]: I0217 14:06:51.089443 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.089455 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.089464 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.098073 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.114724 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"flector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167820 6099 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167887 6099 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.168346 6099 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:06:48.168371 6099 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:06:48.168384 6099 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:06:48.168403 6099 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:06:48.168427 6099 factory.go:656] Stopping watch factory\\\\nI0217 14:06:48.168443 6099 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:06:48.168450 6099 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:06:48.168457 6099 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:06:48.168463 6099 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 
14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\
\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.126334 4836 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.138142 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc 
kubenswrapper[4836]: I0217 14:06:51.189611 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307
c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.193517 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.193589 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.193599 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.193614 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.193630 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.211737 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.211882 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.212026 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:07:07.211996518 +0000 UTC m=+53.554924797 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.212086 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.212124 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.212277 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.212319 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:07.212305146 +0000 UTC m=+53.555233415 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.212364 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:07.212346617 +0000 UTC m=+53.555274886 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.296932 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.296996 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.297012 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.297030 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.297041 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.313607 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.313661 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.313828 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.313867 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.313900 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.313945 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:07.313932427 +0000 UTC m=+53.656860696 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.313828 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.314032 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.314056 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.314138 4836 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:07.314116822 +0000 UTC m=+53.657045151 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.400108 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.400273 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.400408 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.400447 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.400470 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.500886 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 05:24:16.473979782 +0000 UTC Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.503666 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.503705 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.503714 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.503728 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.503739 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.567951 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.568062 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.567951 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.567990 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.568284 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.568364 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.568426 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.568481 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.605924 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.605974 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.605986 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.606004 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.606017 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.617698 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.617921 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.618052 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:06:53.618019186 +0000 UTC m=+39.960947515 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.750827 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.750874 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.750882 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.750896 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.750906 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.852271 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.852342 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.852353 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.852369 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.852395 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.872354 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.876371 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.876466 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.876489 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.876516 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.876541 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.888181 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.891871 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.891931 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.891947 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.891968 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.892179 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.904479 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.908561 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.908591 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.908599 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.908612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.908622 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.920019 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.923459 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.923497 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.923505 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.923519 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.923552 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.937080 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:51Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:51 crc kubenswrapper[4836]: E0217 14:06:51.937250 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.938926 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.938965 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.938974 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.938988 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:51 crc kubenswrapper[4836]: I0217 14:06:51.938999 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:51Z","lastTransitionTime":"2026-02-17T14:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.042505 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.042584 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.042602 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.042625 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.042642 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.145015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.145065 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.145076 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.145092 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.145111 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.247607 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.247646 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.247655 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.247667 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.247676 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.349914 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.349972 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.349981 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.349995 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.350005 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.453839 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.453920 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.453943 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.453973 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.453995 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.501963 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 20:29:06.084279604 +0000 UTC Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.556945 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.556998 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.557010 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.557024 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.557033 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.660244 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.660308 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.660318 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.660331 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.660348 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.763203 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.763248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.763260 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.763275 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.763285 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.867165 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.867248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.867276 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.867357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.867382 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.970564 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.970623 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.970633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.970651 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:52 crc kubenswrapper[4836]: I0217 14:06:52.970661 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:52Z","lastTransitionTime":"2026-02-17T14:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.073092 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.073143 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.073154 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.073174 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.073188 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.175815 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.175869 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.175881 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.175900 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.175912 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.278626 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.278673 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.278684 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.278701 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.278715 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.381968 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.382005 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.382016 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.382035 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.382048 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.485086 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.485121 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.485130 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.485223 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.485234 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.502282 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 05:23:06.422771797 +0000 UTC Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.567499 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.567553 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.567507 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:53 crc kubenswrapper[4836]: E0217 14:06:53.567717 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.567823 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:53 crc kubenswrapper[4836]: E0217 14:06:53.568023 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:53 crc kubenswrapper[4836]: E0217 14:06:53.568168 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:53 crc kubenswrapper[4836]: E0217 14:06:53.568262 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.569284 4836 scope.go:117] "RemoveContainer" containerID="a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.590565 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.590633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.590649 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.590675 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.590700 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.671845 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:53 crc kubenswrapper[4836]: E0217 14:06:53.672103 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:53 crc kubenswrapper[4836]: E0217 14:06:53.672178 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:06:57.672158782 +0000 UTC m=+44.015087051 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.693002 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.693053 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.693066 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.693083 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.693095 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.795343 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.795390 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.795401 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.795419 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.795431 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.898533 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.898589 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.898606 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.898630 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.898646 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:53Z","lastTransitionTime":"2026-02-17T14:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.907054 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.910291 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c"} Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.910903 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.947179 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o:
//e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.966605 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:53 crc kubenswrapper[4836]: I0217 14:06:53.988524 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:53Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.001481 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.001525 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.001542 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.001564 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.001579 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.007528 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.020958 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.037736 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b3
0211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.063832 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"flector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167820 6099 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167887 6099 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.168346 6099 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:06:48.168371 6099 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:06:48.168384 6099 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:06:48.168403 6099 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:06:48.168427 6099 factory.go:656] Stopping watch factory\\\\nI0217 14:06:48.168443 6099 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:06:48.168450 6099 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:06:48.168457 6099 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:06:48.168463 6099 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 
14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\
\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.085612 4836 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597
126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.104222 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.104252 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.104260 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.104274 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.104283 4836 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.105369 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.124387 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.174639 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.194489 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.207516 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.207965 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.208003 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.208015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.208034 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.208047 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.221862 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc 
kubenswrapper[4836]: I0217 14:06:54.238986 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.252174 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.264229 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.310717 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.310756 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.310765 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.310778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.310786 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.412743 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.412781 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.412793 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.412807 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.412817 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.503068 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 07:51:49.562127583 +0000 UTC Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.514924 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.514966 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.514976 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.514993 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.515006 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.583768 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.594332 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.604243 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.618174 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.618217 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.618228 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.618242 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.618252 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.624996 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://efe40e27f5a0b564251e0f62c8b242039a04e92beaa2f621047a67d94afda3e5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"message\\\":\\\"flector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167820 6099 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.167887 6099 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 14:06:48.168346 6099 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0217 14:06:48.168371 6099 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0217 14:06:48.168384 6099 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 14:06:48.168403 6099 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 14:06:48.168427 6099 factory.go:656] Stopping watch factory\\\\nI0217 14:06:48.168443 6099 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 14:06:48.168450 6099 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0217 14:06:48.168457 6099 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 14:06:48.168463 6099 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 
14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\
\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.642334 4836 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",
\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597
126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.655050 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.672642 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.687024 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.700805 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc 
kubenswrapper[4836]: I0217 14:06:54.714472 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.720514 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.720590 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.720604 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.720632 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.720647 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.732560 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:
06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.746947 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.759237 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k
qtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.779535 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b
54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.794003 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.808708 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.820944 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:06:54Z is after 2025-08-24T17:21:41Z" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.822808 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.822841 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.822849 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.822862 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.822873 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.928619 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.928678 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.928690 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.928711 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:54 crc kubenswrapper[4836]: I0217 14:06:54.928724 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:54Z","lastTransitionTime":"2026-02-17T14:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.031314 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.031367 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.031378 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.031397 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.031408 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.134047 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.134086 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.134096 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.134110 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.134121 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.237115 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.237150 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.237159 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.237176 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.237189 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.340136 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.340196 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.340206 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.340220 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.340230 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.443928 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.443996 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.444010 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.444029 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.444047 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.504134 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 21:09:12.861823943 +0000 UTC Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.547652 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.547721 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.547748 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.547778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.547805 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.567888 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.567929 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.568034 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.568119 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:55 crc kubenswrapper[4836]: E0217 14:06:55.568122 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:55 crc kubenswrapper[4836]: E0217 14:06:55.568291 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:55 crc kubenswrapper[4836]: E0217 14:06:55.568423 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:55 crc kubenswrapper[4836]: E0217 14:06:55.568456 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.650617 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.650643 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.650651 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.650664 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.650672 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.756958 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.757411 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.757428 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.757447 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.757458 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.860330 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.860415 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.860433 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.860451 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.860463 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.962358 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.962410 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.962421 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.962437 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:55 crc kubenswrapper[4836]: I0217 14:06:55.962449 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:55Z","lastTransitionTime":"2026-02-17T14:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.065076 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.065117 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.065127 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.065140 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.065149 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.168073 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.168110 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.168118 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.168131 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.168146 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.270357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.270404 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.270415 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.270460 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.270477 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.374196 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.374272 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.374345 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.374378 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.374402 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.476980 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.477024 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.477035 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.477048 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.477058 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.504850 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 19:07:36.316671926 +0000 UTC Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.579912 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.580000 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.580027 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.580053 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.580075 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.682428 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.682476 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.682488 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.682505 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.682515 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.784176 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.784225 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.784241 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.784258 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.784267 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.887281 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.887387 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.887409 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.887433 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.887451 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.989929 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.989994 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.990012 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.990035 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:56 crc kubenswrapper[4836]: I0217 14:06:56.990053 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:56Z","lastTransitionTime":"2026-02-17T14:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.092621 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.092691 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.092712 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.092736 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.092752 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.195354 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.195412 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.195422 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.195437 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.195447 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.297782 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.297838 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.297851 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.297870 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.297882 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.401360 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.401426 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.401439 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.401466 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.401478 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.504140 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.504200 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.504215 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.504233 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.504243 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.505611 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 18:39:32.111427327 +0000 UTC Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.567650 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.567685 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.567724 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:06:57 crc kubenswrapper[4836]: E0217 14:06:57.567826 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:06:57 crc kubenswrapper[4836]: E0217 14:06:57.567885 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.567661 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:06:57 crc kubenswrapper[4836]: E0217 14:06:57.568135 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:06:57 crc kubenswrapper[4836]: E0217 14:06:57.568371 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.607242 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.607308 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.607323 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.607339 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.607351 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.709565 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.709622 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.709634 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.709652 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.709667 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.716158 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:06:57 crc kubenswrapper[4836]: E0217 14:06:57.716316 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:57 crc kubenswrapper[4836]: E0217 14:06:57.716374 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:07:05.716358297 +0000 UTC m=+52.059286566 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.811861 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.811902 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.811912 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.811931 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.811941 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.915015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.915091 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.915116 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.915143 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:57 crc kubenswrapper[4836]: I0217 14:06:57.915158 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:57Z","lastTransitionTime":"2026-02-17T14:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.017880 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.017928 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.017938 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.017956 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.017974 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.121795 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.121857 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.121871 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.121888 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.121906 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.224810 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.224863 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.224880 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.224900 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.224914 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.327330 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.327382 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.327396 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.327414 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.327427 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.431114 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.431160 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.431169 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.431183 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.431193 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.506117 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:07:14.109075669 +0000 UTC
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.534331 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.534395 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.534414 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.534441 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.534459 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.638217 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.638269 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.638278 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.638307 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.638318 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.741122 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.741158 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.741176 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.741193 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.741204 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.843391 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.843440 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.843452 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.843468 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.843479 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.946489 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.946558 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.946573 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.946596 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:58 crc kubenswrapper[4836]: I0217 14:06:58.946608 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:58Z","lastTransitionTime":"2026-02-17T14:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.049381 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.049450 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.049468 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.049490 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.049509 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.151614 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.151649 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.151657 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.151671 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.151681 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.254626 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.254672 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.254682 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.254696 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.254705 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.357514 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.357568 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.357647 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.357664 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.357676 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.460823 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.460863 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.460873 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.460890 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.460901 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.506478 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 18:43:44.054808581 +0000 UTC
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.564588 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.564645 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.564657 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.564675 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.564686 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.568036 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.568088 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.568088 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.568031 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:06:59 crc kubenswrapper[4836]: E0217 14:06:59.568218 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c"
Feb 17 14:06:59 crc kubenswrapper[4836]: E0217 14:06:59.568449 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 14:06:59 crc kubenswrapper[4836]: E0217 14:06:59.568614 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 14:06:59 crc kubenswrapper[4836]: E0217 14:06:59.568703 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.667247 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.667330 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.667342 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.667358 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.667391 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.770516 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.770606 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.770633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.770666 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.770689 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.873096 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.873192 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.873212 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.873234 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.873248 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.976116 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.976152 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.976161 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.976174 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:06:59 crc kubenswrapper[4836]: I0217 14:06:59.976183 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:06:59Z","lastTransitionTime":"2026-02-17T14:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.078893 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.079004 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.079026 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.079049 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.079069 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.182376 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.182420 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.182431 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.182445 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.182457 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.284954 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.284997 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.285006 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.285081 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.285092 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.389322 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.389402 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.389423 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.389451 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.389467 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.492540 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.492597 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.492607 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.492623 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.492633 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.506884 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 02:23:37.451846543 +0000 UTC
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.594706 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.594756 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.594768 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.594787 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.594799 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.697559 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.697590 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.697599 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.697612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.697620 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.799450 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.799491 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.799501 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.799516 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.799526 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.902288 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.902370 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.902381 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.902425 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:00 crc kubenswrapper[4836]: I0217 14:07:00.902442 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:00Z","lastTransitionTime":"2026-02-17T14:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.004588 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.004663 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.004675 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.004694 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.004705 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.107013 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.107054 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.107066 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.107081 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.107092 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.209165 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.209212 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.209223 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.209238 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.209249 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.311199 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.311244 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.311255 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.311269 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.311279 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.414236 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.414313 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.414326 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.414344 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.414382 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.507156 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:16:53.037740429 +0000 UTC Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.517152 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.517213 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.517229 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.517251 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.517265 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.567100 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:01 crc kubenswrapper[4836]: E0217 14:07:01.567251 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.568142 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.568254 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:01 crc kubenswrapper[4836]: E0217 14:07:01.568365 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.568456 4836 scope.go:117] "RemoveContainer" containerID="b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f" Feb 17 14:07:01 crc kubenswrapper[4836]: E0217 14:07:01.568509 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.568759 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:01 crc kubenswrapper[4836]: E0217 14:07:01.569540 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.597267 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.610025 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.620217 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.628837 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.629009 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.629115 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.629259 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.633932 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.648672 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.659743 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.669764 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b3
0211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.690966 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.704510 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.716272 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.728005 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.732467 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.732502 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.732514 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.732530 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.732542 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.743079 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.763693 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.776553 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.790320 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc 
kubenswrapper[4836]: I0217 14:07:01.804553 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.819150 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.832159 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.834759 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.834807 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.834818 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.834835 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.834847 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.937148 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.937200 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.937213 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.937233 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.937247 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:01Z","lastTransitionTime":"2026-02-17T14:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.938148 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/1.log" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.943699 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1"} Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.948459 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.965131 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.979338 4836 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:01 crc kubenswrapper[4836]: I0217 14:07:01.993284 4836 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:01Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.007067 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.018613 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.037475 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c0659834172
9c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.040670 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.040750 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.040758 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.040775 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.040786 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.052513 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.066767 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.081041 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.094287 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.107813 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.118588 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.129353 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.129397 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.129408 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.129428 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.129441 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.139156 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: E0217 14:07:02.143599 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.149961 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.150023 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.150034 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.150112 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.150135 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.153540 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: E0217 14:07:02.168082 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.169870 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.171913 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.171941 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.171951 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.171965 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.171975 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.185721 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: E0217 14:07:02.196228 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.201038 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.201102 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.201117 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.201136 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.201148 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: E0217 14:07:02.219537 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.219918 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc 
kubenswrapper[4836]: I0217 14:07:02.226074 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.226102 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.226112 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.226126 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.226136 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: E0217 14:07:02.253664 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: E0217 14:07:02.253843 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.255658 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.255693 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.255707 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.255747 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.255758 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.358796 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.358836 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.358845 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.358859 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.358868 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.460986 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.461023 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.461033 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.461051 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.461062 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.507930 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 10:36:05.386499119 +0000 UTC Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.564348 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.564406 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.564419 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.564449 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.564461 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.667187 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.667238 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.667248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.667262 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.667270 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.769113 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.769160 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.769172 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.769187 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.769197 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.871978 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.872034 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.872043 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.872056 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.872065 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.950258 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/2.log" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.950932 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/1.log" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.954355 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1" exitCode=1 Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.954410 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.954497 4836 scope.go:117] "RemoveContainer" containerID="b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.955240 4836 scope.go:117] "RemoveContainer" containerID="4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1" Feb 17 14:07:02 crc kubenswrapper[4836]: E0217 14:07:02.955488 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.968436 4836 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.974120 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.974158 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.974167 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.974181 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.974191 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:02Z","lastTransitionTime":"2026-02-17T14:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.981494 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:02 crc kubenswrapper[4836]: I0217 14:07:02.992399 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:02Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.003169 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.020048 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c0659834172
9c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.034097 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.046656 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.060423 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.071917 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.075803 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.075839 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc 
kubenswrapper[4836]: I0217 14:07:03.075849 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.075864 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.075874 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.083610 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.102222 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b66e2b9fc807500c9a546e089579ae9af2485f2fa86b58e90b5979b4d18a052f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"rics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0217 14:06:49.751267 6265 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0217 14:06:49.751271 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-vt5sw\\\\nI0217 14:06:49.751281 6265 
obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751311 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0217 14:06:49.751325 6265 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0217 14:06:49.751329 6265 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0217 14:06:49.751338 6265 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nF0217 14:06:49.751346 6265 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 
services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df67986
11b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.116052 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.125723 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc 
kubenswrapper[4836]: I0217 14:07:03.137871 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307
c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.149000 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.162208 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.173514 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.178610 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc 
kubenswrapper[4836]: I0217 14:07:03.178662 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.178679 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.178700 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.178712 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.281781 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.281811 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.281819 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.281832 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.281842 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.384780 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.384825 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.384836 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.384852 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.384864 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.487191 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.487239 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.487251 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.487267 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.487277 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.508951 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 07:23:17.013253046 +0000 UTC Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.567445 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.567492 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.567515 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:03 crc kubenswrapper[4836]: E0217 14:07:03.567595 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:03 crc kubenswrapper[4836]: E0217 14:07:03.567717 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:03 crc kubenswrapper[4836]: E0217 14:07:03.567831 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.568056 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:03 crc kubenswrapper[4836]: E0217 14:07:03.568205 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.589188 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.589233 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.589242 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.589255 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.589264 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.691665 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.691726 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.691740 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.691762 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.691776 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.794034 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.794107 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.794117 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.794131 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.794142 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.896616 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.896660 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.896671 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.896687 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.896697 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:03Z","lastTransitionTime":"2026-02-17T14:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.959053 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/2.log" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.963242 4836 scope.go:117] "RemoveContainer" containerID="4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1" Feb 17 14:07:03 crc kubenswrapper[4836]: E0217 14:07:03.963474 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:07:03 crc kubenswrapper[4836]: I0217 14:07:03.976230 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:03Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.000133 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.000187 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.000205 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc 
kubenswrapper[4836]: I0217 14:07:04.000227 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.000244 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.008474 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.027109 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.046362 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.064764 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.078643 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.099585 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.103851 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.103916 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.103933 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.103959 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.103975 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.115702 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.130765 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.143412 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.154568 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.174333 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.190792 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.202530 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.206838 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.206873 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.206883 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.206898 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.206914 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.214443 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc 
kubenswrapper[4836]: I0217 14:07:04.232649 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.245957 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.309925 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc 
kubenswrapper[4836]: I0217 14:07:04.309966 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.309977 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.309992 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.310003 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.412934 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.412981 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.412992 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.413011 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.413030 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.509396 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 17:26:19.990440991 +0000 UTC Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.516099 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.516173 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.516188 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.516210 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.516226 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.579022 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.591021 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.600353 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.618952 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.619009 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.619028 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.619050 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.619067 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.620185 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.635819 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.647942 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.663050 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.676375 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.691232 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.701679 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.720269 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.722260 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.722335 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.722347 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.722363 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.722373 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.733860 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.750466 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.763791 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b
9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.777573 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc 
kubenswrapper[4836]: I0217 14:07:04.791406 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.804492 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:04Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.825118 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc 
kubenswrapper[4836]: I0217 14:07:04.825170 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.825181 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.825202 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.825215 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.928646 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.928704 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.928719 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.928737 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:04 crc kubenswrapper[4836]: I0217 14:07:04.928753 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:04Z","lastTransitionTime":"2026-02-17T14:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.030995 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.031062 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.031087 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.031111 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.031126 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.133386 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.133426 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.133441 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.133458 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.133469 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.235694 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.235743 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.235754 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.235772 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.235788 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.338120 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.338166 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.338178 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.338196 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.338210 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.440824 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.440863 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.440875 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.440890 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.440901 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.509880 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 21:29:28.785126987 +0000 UTC Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.542799 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.542876 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.542893 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.542918 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.542932 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.567209 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.567241 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.567250 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.567270 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:05 crc kubenswrapper[4836]: E0217 14:07:05.567358 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:05 crc kubenswrapper[4836]: E0217 14:07:05.567545 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:05 crc kubenswrapper[4836]: E0217 14:07:05.567629 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:05 crc kubenswrapper[4836]: E0217 14:07:05.567701 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.645380 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.645424 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.645437 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.645454 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.645466 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.748331 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.748380 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.748392 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.748406 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.748415 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.800344 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:05 crc kubenswrapper[4836]: E0217 14:07:05.800531 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:07:05 crc kubenswrapper[4836]: E0217 14:07:05.800587 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:07:21.800572598 +0000 UTC m=+68.143500867 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.851352 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.851405 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.851418 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.851439 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.851448 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.953961 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.954002 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.954013 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.954051 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:05 crc kubenswrapper[4836]: I0217 14:07:05.954063 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:05Z","lastTransitionTime":"2026-02-17T14:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.057079 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.057356 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.057369 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.057389 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.057400 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.160470 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.160525 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.160533 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.160550 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.160560 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.263821 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.263882 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.263895 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.263915 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.263928 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.367001 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.367061 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.367075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.367106 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.367118 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.469667 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.469740 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.469759 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.469787 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.469805 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.510725 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 16:34:02.9226518 +0000 UTC Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.572111 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.572189 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.572199 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.572212 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.572223 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.674969 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.675015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.675030 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.675048 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.675057 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.778000 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.778046 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.778058 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.778075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.778086 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.880679 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.881109 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.881245 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.881371 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.881468 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.983918 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.983969 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.983981 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.983999 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:06 crc kubenswrapper[4836]: I0217 14:07:06.984012 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:06Z","lastTransitionTime":"2026-02-17T14:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.086686 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.086743 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.086755 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.086769 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.086779 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.189258 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.189343 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.189357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.189375 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.189387 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.214523 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.214632 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:07:39.214609419 +0000 UTC m=+85.557537698 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.214730 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.214801 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.214878 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.214908 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.214928 4836 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:39.214917567 +0000 UTC m=+85.557845836 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.214947 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:39.214937147 +0000 UTC m=+85.557865416 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.292440 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.292478 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.292490 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.292508 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.292523 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.316110 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.316154 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.316260 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.316263 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.316320 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.316336 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:07 crc 
kubenswrapper[4836]: E0217 14:07:07.316276 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.316389 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.316390 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:39.316374223 +0000 UTC m=+85.659302512 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.316424 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:07:39.316416284 +0000 UTC m=+85.659344553 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.395009 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.395053 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.395065 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.395081 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.395092 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.498866 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.498922 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.498934 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.498953 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.498965 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.511037 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 06:14:50.178798318 +0000 UTC Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.567926 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.568068 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.568062 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.568077 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.568209 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.568361 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.568453 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:07 crc kubenswrapper[4836]: E0217 14:07:07.568591 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.601594 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.601639 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.601652 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.601669 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.601680 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.704253 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.704360 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.704380 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.704397 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.704409 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.807239 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.807265 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.807273 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.807285 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.807309 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.909025 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.909068 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.909080 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.909096 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:07 crc kubenswrapper[4836]: I0217 14:07:07.909108 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:07Z","lastTransitionTime":"2026-02-17T14:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.011514 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.011824 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.011896 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.011968 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.012040 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.114583 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.114627 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.114638 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.114654 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.114666 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.216857 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.216912 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.216924 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.216941 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.216953 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.319963 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.320057 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.320076 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.320099 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.320116 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.422952 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.423014 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.423031 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.423056 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.423074 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.511783 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 07:44:40.412569106 +0000 UTC Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.526477 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.526545 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.526604 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.526629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.526646 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.629398 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.629469 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.629483 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.629504 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.629519 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.733076 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.733124 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.733135 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.733151 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.733164 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.837360 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.837609 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.837725 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.837836 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.837904 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.940490 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.940836 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.940924 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.941036 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:08 crc kubenswrapper[4836]: I0217 14:07:08.941128 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:08Z","lastTransitionTime":"2026-02-17T14:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.043879 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.043932 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.043947 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.043967 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.043982 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.146745 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.146787 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.146798 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.146813 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.146826 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.249926 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.250223 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.250360 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.250441 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.250524 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.340513 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.353183 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.353516 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.353629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.353750 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.353871 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.361610 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.374399 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.386898 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.388415 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.397270 4836 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.399610 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.410445 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.431645 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.446085 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.455960 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.455993 4836 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.456004 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.456019 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.456030 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.458397 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.469215 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.481433 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.494377 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.512051 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 01:10:55.402095534 +0000 UTC Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.516641 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 
model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.527077 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b
9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.536865 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc 
kubenswrapper[4836]: I0217 14:07:09.549453 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307
c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.558229 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.558273 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.558283 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.558315 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.558327 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.562380 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z 
is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.567434 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.567458 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.567466 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.567519 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:09 crc kubenswrapper[4836]: E0217 14:07:09.567548 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:09 crc kubenswrapper[4836]: E0217 14:07:09.567636 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:09 crc kubenswrapper[4836]: E0217 14:07:09.567739 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:09 crc kubenswrapper[4836]: E0217 14:07:09.567796 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.573598 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.584901 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.599077 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.610686 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.629067 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.642254 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.653899 4836 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.661172 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.661205 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.661213 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.661240 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.661249 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.667069 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.677219 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.689822 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.701053 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.720526 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.736059 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.749169 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.764570 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.764612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.764623 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.764639 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.764650 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.766470 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.779548 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.792955 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.803281 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.813026 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:09Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:09 crc 
kubenswrapper[4836]: I0217 14:07:09.867792 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.867834 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.867858 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.867872 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.867882 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.970175 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.970211 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.970219 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.970232 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:09 crc kubenswrapper[4836]: I0217 14:07:09.970240 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:09Z","lastTransitionTime":"2026-02-17T14:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.073198 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.073247 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.073258 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.073277 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.073305 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.175495 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.175577 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.175591 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.175630 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.175645 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.278567 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.278631 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.278644 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.278662 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.278677 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.382144 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.382189 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.382201 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.382408 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.382567 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.485687 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.485748 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.485758 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.485774 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.485789 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.513085 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:53:24.770827489 +0000 UTC Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.587871 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.587913 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.587923 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.587937 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.587946 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.690927 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.690989 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.691002 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.691022 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.691034 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.792983 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.793023 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.793033 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.793048 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.793060 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.895536 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.895580 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.895592 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.895609 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.895623 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.998893 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.998988 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.999024 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.999058 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:10 crc kubenswrapper[4836]: I0217 14:07:10.999082 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:10Z","lastTransitionTime":"2026-02-17T14:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.102125 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.102185 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.102197 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.102222 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.102235 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.204166 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.204225 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.204240 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.204263 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.204506 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.307234 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.307286 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.307319 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.307334 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.307344 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.409248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.409324 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.409340 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.409356 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.409366 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.512197 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.512260 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.512281 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.512332 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.512352 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.513445 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 12:13:07.27404931 +0000 UTC Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.567475 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.567523 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.567475 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.567615 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:11 crc kubenswrapper[4836]: E0217 14:07:11.567965 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:11 crc kubenswrapper[4836]: E0217 14:07:11.568289 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:11 crc kubenswrapper[4836]: E0217 14:07:11.568347 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:11 crc kubenswrapper[4836]: E0217 14:07:11.569552 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.615271 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.615375 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.615398 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.615420 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.615438 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.718865 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.718938 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.718961 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.718993 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.719018 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.822759 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.822804 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.822814 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.822830 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.822839 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.925042 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.925135 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.925148 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.925165 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:11 crc kubenswrapper[4836]: I0217 14:07:11.925178 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:11Z","lastTransitionTime":"2026-02-17T14:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.028468 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.028527 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.028542 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.028561 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.028574 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.130903 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.130964 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.130976 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.130995 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.131012 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.233525 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.233580 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.233591 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.233608 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.233619 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.270350 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.270396 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.270407 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.270420 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.270428 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: E0217 14:07:12.282385 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.287090 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.287134 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.287143 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.287158 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.287167 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: E0217 14:07:12.299901 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.303707 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.303744 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.303754 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.303777 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.303788 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: E0217 14:07:12.322068 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.326195 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.326248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.326258 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.326271 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.326280 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: E0217 14:07:12.342283 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.346763 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.346823 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.346842 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.346867 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.346886 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: E0217 14:07:12.362727 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:12Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:12 crc kubenswrapper[4836]: E0217 14:07:12.362847 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.364527 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.364550 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.364558 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.364571 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.364581 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.468263 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.468358 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.468381 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.468406 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.468424 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.513936 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 00:05:59.153114735 +0000 UTC Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.570768 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.570849 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.570865 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.570917 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.570937 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.674191 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.674259 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.674282 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.674364 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.674389 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.776996 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.777172 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.777207 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.777237 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.777259 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.890804 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.890848 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.890859 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.890879 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.890892 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.992653 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.992689 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.992698 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.992710 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:12 crc kubenswrapper[4836]: I0217 14:07:12.992718 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:12Z","lastTransitionTime":"2026-02-17T14:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.095182 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.095224 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.095236 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.095253 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.095265 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.197937 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.197992 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.198001 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.198016 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.198026 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.300521 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.300563 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.300579 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.300594 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.300606 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.402477 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.402531 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.402544 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.402561 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.402574 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.505520 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.505609 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.505633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.505664 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.505685 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.514741 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 07:29:05.7292283 +0000 UTC Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.567088 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.567139 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.567196 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.567141 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:13 crc kubenswrapper[4836]: E0217 14:07:13.567360 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:13 crc kubenswrapper[4836]: E0217 14:07:13.567786 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:13 crc kubenswrapper[4836]: E0217 14:07:13.567986 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:13 crc kubenswrapper[4836]: E0217 14:07:13.568101 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.608643 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.608692 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.608704 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.608723 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.608733 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.711550 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.711613 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.711629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.711654 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.711673 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.813965 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.814008 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.814017 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.814032 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.814041 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.916078 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.916128 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.916145 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.916163 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:13 crc kubenswrapper[4836]: I0217 14:07:13.916176 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:13Z","lastTransitionTime":"2026-02-17T14:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.018976 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.019030 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.019044 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.019062 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.019075 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.121750 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.121787 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.121795 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.121809 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.121818 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.224668 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.224722 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.224735 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.224751 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.224763 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.327254 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.327371 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.327391 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.327413 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.327429 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.430222 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.430277 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.430333 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.430358 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.430373 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.515508 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 05:51:37.674642736 +0000 UTC Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.532854 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.532958 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.533020 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.533043 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.533060 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.580866 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf1
0370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.593645 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.605393 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881e
a75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.619876 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.634964 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.636027 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.636062 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.636073 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.636089 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.636102 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.646176 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.666914 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.682267 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.694379 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.706932 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.718617 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.730642 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.739225 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.739271 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc 
kubenswrapper[4836]: I0217 14:07:14.739284 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.739322 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.739334 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.742501 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.767249 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.782597 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.793575 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.808370 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.820228 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40
a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\
\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:14Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.841070 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.841111 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.841122 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.841136 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.841145 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.943798 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.943834 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.943843 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.943857 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:14 crc kubenswrapper[4836]: I0217 14:07:14.943868 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:14Z","lastTransitionTime":"2026-02-17T14:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.046138 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.046479 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.046488 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.046502 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.046511 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.149569 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.149614 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.149624 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.149639 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.149651 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.252155 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.252220 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.252239 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.252264 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.252281 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.356321 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.356365 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.356373 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.356386 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.356396 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.459798 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.459850 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.459866 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.459886 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.459901 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.515645 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 14:21:37.347378851 +0000 UTC Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.563685 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.563732 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.563743 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.563757 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.563766 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.567413 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.567501 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.567520 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.567677 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:15 crc kubenswrapper[4836]: E0217 14:07:15.567674 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:15 crc kubenswrapper[4836]: E0217 14:07:15.567797 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:15 crc kubenswrapper[4836]: E0217 14:07:15.567870 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:15 crc kubenswrapper[4836]: E0217 14:07:15.567929 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.667424 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.667493 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.667709 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.667737 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.667754 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.770971 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.771010 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.771021 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.771037 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.771049 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.874217 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.874317 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.874337 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.874357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.874375 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.977247 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.977287 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.977320 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.977336 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:15 crc kubenswrapper[4836]: I0217 14:07:15.977346 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:15Z","lastTransitionTime":"2026-02-17T14:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.080121 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.080156 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.080165 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.080181 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.080191 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.183197 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.183244 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.183260 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.183282 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.183318 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.286680 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.286739 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.286752 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.286771 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.286785 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.389472 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.389508 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.389518 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.389534 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.389545 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.492717 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.492791 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.492808 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.492826 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.492839 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.517072 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 14:59:40.645254981 +0000 UTC Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.595670 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.595743 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.595754 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.595769 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.595781 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.707139 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.707252 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.707269 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.707355 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.707382 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.810459 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.810516 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.810528 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.810564 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.810577 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.913101 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.913137 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.913149 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.913165 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:16 crc kubenswrapper[4836]: I0217 14:07:16.913178 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:16Z","lastTransitionTime":"2026-02-17T14:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.017012 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.017111 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.017135 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.017162 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.017173 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.120179 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.120230 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.120241 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.120258 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.120270 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.223328 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.223402 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.223413 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.223431 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.223444 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.325774 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.325830 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.325879 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.325899 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.325911 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.428173 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.428224 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.428236 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.428252 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.428261 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.517404 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 13:38:13.192741311 +0000 UTC Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.532512 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.532540 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.532550 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.532570 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.532586 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.567374 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:17 crc kubenswrapper[4836]: E0217 14:07:17.567898 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.568414 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.568426 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.568592 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:17 crc kubenswrapper[4836]: E0217 14:07:17.569362 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:17 crc kubenswrapper[4836]: E0217 14:07:17.569506 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:17 crc kubenswrapper[4836]: E0217 14:07:17.569616 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.634896 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.634931 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.634939 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.634952 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.634961 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.737884 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.737927 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.737937 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.737950 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.737960 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.841464 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.841514 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.841526 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.841543 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.841554 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.943926 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.943971 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.943981 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.943994 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:17 crc kubenswrapper[4836]: I0217 14:07:17.944003 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:17Z","lastTransitionTime":"2026-02-17T14:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.046740 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.046788 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.046800 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.046817 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.046832 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.150958 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.151009 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.151019 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.151034 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.151044 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.253940 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.254006 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.254066 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.254089 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.254105 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.357250 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.357313 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.357324 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.357339 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.357353 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.459831 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.459901 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.459914 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.459929 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.459940 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.518168 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 12:49:23.77647241 +0000 UTC Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.562398 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.562470 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.562481 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.562495 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.562506 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.664709 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.664756 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.664770 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.664788 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.664800 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.767938 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.767990 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.768003 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.768021 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.768035 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.870523 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.870577 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.870602 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.870629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.870646 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.976210 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.976264 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.976276 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.976309 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:18 crc kubenswrapper[4836]: I0217 14:07:18.976324 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:18Z","lastTransitionTime":"2026-02-17T14:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.078624 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.078668 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.078680 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.078696 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.078707 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.181435 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.181473 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.181483 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.181499 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.181509 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.284608 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.284648 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.284674 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.284688 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.284699 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.387611 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.387659 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.387669 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.387684 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.387693 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.490085 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.490126 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.490135 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.490150 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.490160 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.518448 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 08:21:40.03534261 +0000 UTC Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.567555 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.567611 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.567636 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:19 crc kubenswrapper[4836]: E0217 14:07:19.567676 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.567750 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:19 crc kubenswrapper[4836]: E0217 14:07:19.567875 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:19 crc kubenswrapper[4836]: E0217 14:07:19.568207 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:19 crc kubenswrapper[4836]: E0217 14:07:19.568325 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.568455 4836 scope.go:117] "RemoveContainer" containerID="4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1" Feb 17 14:07:19 crc kubenswrapper[4836]: E0217 14:07:19.568656 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.592107 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.592155 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.592167 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.592183 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.592196 4836 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.694550 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.694593 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.694605 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.694626 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.694637 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.796560 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.796606 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.796615 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.796632 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.796642 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.898887 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.898928 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.898939 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.898952 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:19 crc kubenswrapper[4836]: I0217 14:07:19.898963 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:19Z","lastTransitionTime":"2026-02-17T14:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.001622 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.001659 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.001669 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.001685 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.001696 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.103398 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.103437 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.103450 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.103464 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.103476 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.206174 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.206208 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.206216 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.206231 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.206240 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.308853 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.308895 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.308906 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.308921 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.308932 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.411589 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.411629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.411640 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.411656 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.411666 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.513858 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.514130 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.514230 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.514344 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.514428 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.519198 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 00:02:02.10399266 +0000 UTC Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.616798 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.616835 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.616842 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.616856 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.616865 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.718981 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.719276 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.719377 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.719472 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.719564 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.821447 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.821504 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.821516 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.821542 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.821555 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.924053 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.924109 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.924121 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.924141 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:20 crc kubenswrapper[4836]: I0217 14:07:20.924154 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:20Z","lastTransitionTime":"2026-02-17T14:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.026469 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.026765 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.026909 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.027038 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.027142 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.129728 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.129979 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.130046 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.130149 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.130222 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.232387 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.232441 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.232451 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.232464 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.232475 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.335379 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.335431 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.335442 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.335458 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.335470 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.437213 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.437244 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.437252 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.437264 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.437275 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.520006 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 10:24:47.01753924 +0000 UTC Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.539878 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.539921 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.539937 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.539953 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.539964 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.567032 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.567098 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.567055 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.567106 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:21 crc kubenswrapper[4836]: E0217 14:07:21.567185 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:21 crc kubenswrapper[4836]: E0217 14:07:21.567280 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:21 crc kubenswrapper[4836]: E0217 14:07:21.567375 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:21 crc kubenswrapper[4836]: E0217 14:07:21.567449 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.642388 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.642444 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.642457 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.642477 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.642489 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.745022 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.745081 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.745092 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.745108 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.745120 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.846854 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.846892 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.846900 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.846912 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.846920 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.882624 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:21 crc kubenswrapper[4836]: E0217 14:07:21.882858 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:07:21 crc kubenswrapper[4836]: E0217 14:07:21.882961 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:07:53.882936762 +0000 UTC m=+100.225865071 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.949545 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.949889 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.949970 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.950044 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:21 crc kubenswrapper[4836]: I0217 14:07:21.950129 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:21Z","lastTransitionTime":"2026-02-17T14:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.052690 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.052740 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.052752 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.052773 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.052785 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.155619 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.155669 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.155685 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.155709 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.155726 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.257775 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.257805 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.257814 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.257829 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.257838 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.360547 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.360613 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.360625 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.360640 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.360651 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.408006 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.408336 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.408443 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.408536 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.408626 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: E0217 14:07:22.421993 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:22Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.426398 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.426463 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.426480 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.426509 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.426548 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: E0217 14:07:22.441908 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:22Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.445685 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.445720 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.445728 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.445742 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.445752 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: E0217 14:07:22.456921 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:22Z is after 2025-08-24T17:21:41Z"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.460228 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.460256 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.460264 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.460277 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.460287 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: E0217 14:07:22.472426 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:22Z is after 2025-08-24T17:21:41Z"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.475729 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.475762 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.475771 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.475784 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.475792 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: E0217 14:07:22.487045 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:22Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:22 crc kubenswrapper[4836]: E0217 14:07:22.487163 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.488625 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.488654 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.488664 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.488678 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.488689 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.521165 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:04:09.502457403 +0000 UTC Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.590726 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.590774 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.590786 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.590801 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.590813 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.693459 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.693512 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.693524 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.693543 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.693555 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.795566 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.795610 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.795622 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.795640 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.795656 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.898087 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.898127 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.898140 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.898156 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:22 crc kubenswrapper[4836]: I0217 14:07:22.898166 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:22Z","lastTransitionTime":"2026-02-17T14:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.000928 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.000973 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.000983 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.000996 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.001006 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.103415 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.103476 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.103490 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.103510 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.103524 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.206044 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.206084 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.206095 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.206110 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.206122 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.309284 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.309339 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.309350 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.309364 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.309374 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.412289 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.412351 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.412363 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.412380 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.412392 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.514867 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.514904 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.514915 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.514932 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.514945 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.521875 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 06:01:48.085579305 +0000 UTC Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.567275 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.567350 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:23 crc kubenswrapper[4836]: E0217 14:07:23.567419 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.567289 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.567516 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:23 crc kubenswrapper[4836]: E0217 14:07:23.567575 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:23 crc kubenswrapper[4836]: E0217 14:07:23.567652 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:23 crc kubenswrapper[4836]: E0217 14:07:23.567713 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.617587 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.617634 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.617644 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.617684 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.617694 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.720465 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.720518 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.720537 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.720556 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.720566 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.823357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.823408 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.823421 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.823436 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.823448 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.925667 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.925710 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.925721 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.925736 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:23 crc kubenswrapper[4836]: I0217 14:07:23.925749 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:23Z","lastTransitionTime":"2026-02-17T14:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.027882 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.027924 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.027936 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.027951 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.027961 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.130600 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.130633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.130642 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.130659 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.130670 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.232215 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.232255 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.232266 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.232280 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.232310 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.334519 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.334553 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.334562 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.334574 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.334583 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.437657 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.437708 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.437720 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.437746 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.437757 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.522114 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:13:08.177979524 +0000 UTC Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.540666 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.540713 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.540733 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.540778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.540793 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.587665 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.600146 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.609708 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.619577 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.640007 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e7790
36cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c0659834172
9c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.644284 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.644327 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.644337 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.644352 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.644363 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.656222 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.668609 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd
23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.681380 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.695902 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.707382 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.718691 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.735741 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.745782 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.745871 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.745889 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.745912 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.745925 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.750412 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1
4e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.760118 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc 
kubenswrapper[4836]: I0217 14:07:24.774603 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307
c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.785461 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.795219 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.805829 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:24Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.848073 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc 
kubenswrapper[4836]: I0217 14:07:24.848139 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.848149 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.848169 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.848184 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.950888 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.950927 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.950942 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.950962 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:24 crc kubenswrapper[4836]: I0217 14:07:24.950975 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:24Z","lastTransitionTime":"2026-02-17T14:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.027664 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/0.log" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.027747 4836 generic.go:334] "Generic (PLEG): container finished" podID="592aa549-1b1b-441e-93e4-0821e05ff2b2" containerID="d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc" exitCode=1 Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.027793 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerDied","Data":"d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.028206 4836 scope.go:117] "RemoveContainer" containerID="d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.040954 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.052805 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.054484 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.054510 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.054520 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.054535 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.054545 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.064601 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.084512 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.096406 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.109488 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.131563 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.144694 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.155710 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.157791 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.157816 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc 
kubenswrapper[4836]: I0217 14:07:25.157826 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.157864 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.157875 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.167224 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.184860 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.199011 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.210200 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.224199 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.234882 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40
a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\
\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.249490 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.260899 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.260940 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.260948 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.260963 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.260973 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.264611 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:24Z\\\",\\\"message\\\":\\\"2026-02-17T14:06:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4\\\\n2026-02-17T14:06:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4 to /host/opt/cni/bin/\\\\n2026-02-17T14:06:39Z [verbose] multus-daemon started\\\\n2026-02-17T14:06:39Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:07:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.275503 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-17T14:07:25Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.362889 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.362935 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.362948 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.362966 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.362977 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.465625 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.465668 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.465677 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.465692 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.465700 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.522362 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 20:21:02.339431977 +0000 UTC Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.566986 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:25 crc kubenswrapper[4836]: E0217 14:07:25.567115 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.567123 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:25 crc kubenswrapper[4836]: E0217 14:07:25.567273 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.567371 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:25 crc kubenswrapper[4836]: E0217 14:07:25.567443 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.567586 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:25 crc kubenswrapper[4836]: E0217 14:07:25.567658 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.568460 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.568483 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.568491 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.568502 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.568512 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.671163 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.671192 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.671200 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.671213 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.671223 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.774582 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.774646 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.774668 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.774696 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.774718 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.877063 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.877100 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.877108 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.877121 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.877129 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.979903 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.979948 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.979960 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.979976 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:25 crc kubenswrapper[4836]: I0217 14:07:25.979990 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:25Z","lastTransitionTime":"2026-02-17T14:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.031612 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/0.log" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.031664 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerStarted","Data":"b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.045923 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.057590 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.070044 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.081798 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.081832 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.081843 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.081856 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.081865 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.088611 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.103106 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.114795 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.127042 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.140829 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.151239 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.162316 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc 
kubenswrapper[4836]: I0217 14:07:26.173797 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.184462 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.184499 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.184510 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.184527 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.184538 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.184672 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d563
29a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:24Z\\\",\\\"message\\\":\\\"2026-02-17T14:06:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4\\\\n2026-02-17T14:06:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4 to /host/opt/cni/bin/\\\\n2026-02-17T14:06:39Z [verbose] multus-daemon started\\\\n2026-02-17T14:06:39Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:07:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.194272 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.204236 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.222447 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.233739 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.246848 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.259730 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:26Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.286424 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.286449 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.286457 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.286472 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.286481 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.388639 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.388695 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.388704 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.388717 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.388726 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.491378 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.491424 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.491436 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.491451 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.491465 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.522963 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:56:33.039720854 +0000 UTC Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.593245 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.593274 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.593281 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.593313 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.593322 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.696065 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.696116 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.696129 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.696146 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.696158 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.798387 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.798429 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.798438 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.798453 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.798462 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.901175 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.901224 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.901236 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.901253 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:26 crc kubenswrapper[4836]: I0217 14:07:26.901265 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:26Z","lastTransitionTime":"2026-02-17T14:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.003961 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.004015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.004025 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.004043 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.004052 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.106446 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.106493 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.106503 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.106519 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.106530 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.209125 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.209170 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.209183 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.209200 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.209211 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.312020 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.312075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.312085 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.312101 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.312112 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.414693 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.414765 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.414777 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.414814 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.414825 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.517159 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.517191 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.517202 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.517218 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.517229 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.523498 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 02:48:27.143352604 +0000 UTC Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.567996 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.568042 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.568099 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.568138 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:27 crc kubenswrapper[4836]: E0217 14:07:27.568229 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:27 crc kubenswrapper[4836]: E0217 14:07:27.568418 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:27 crc kubenswrapper[4836]: E0217 14:07:27.568512 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:27 crc kubenswrapper[4836]: E0217 14:07:27.568655 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.619600 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.619643 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.619654 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.619667 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.619675 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.722243 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.722285 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.722318 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.722336 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.722347 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.824265 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.824336 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.824349 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.824364 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.824375 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.927096 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.927139 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.927151 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.927166 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:27 crc kubenswrapper[4836]: I0217 14:07:27.927194 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:27Z","lastTransitionTime":"2026-02-17T14:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.029653 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.029719 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.029754 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.029778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.029793 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.132529 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.132600 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.132611 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.132629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.132662 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.234688 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.234733 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.234746 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.234763 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.234773 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.338519 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.338648 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.338672 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.338694 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.338706 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.441467 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.441517 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.441529 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.441545 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.441556 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.524456 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 14:48:24.594580092 +0000 UTC Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.543725 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.543773 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.543789 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.543821 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.543834 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.646532 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.646592 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.646605 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.646622 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.646633 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.748886 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.748933 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.748943 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.748959 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.748989 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.850643 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.851221 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.851240 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.851256 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.851267 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.953604 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.953638 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.953646 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.953659 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:28 crc kubenswrapper[4836]: I0217 14:07:28.953668 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:28Z","lastTransitionTime":"2026-02-17T14:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.055676 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.055757 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.055769 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.055787 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.055798 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.158822 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.158944 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.158963 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.158985 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.159002 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.262661 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.262714 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.262726 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.262743 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.262752 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.365801 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.365864 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.365875 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.365893 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.365903 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.468475 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.468538 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.468553 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.468570 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.468904 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.525628 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 14:02:29.486625263 +0000 UTC Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.567704 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.567728 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.567741 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:29 crc kubenswrapper[4836]: E0217 14:07:29.568084 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:29 crc kubenswrapper[4836]: E0217 14:07:29.568225 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:29 crc kubenswrapper[4836]: E0217 14:07:29.568323 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.568640 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:29 crc kubenswrapper[4836]: E0217 14:07:29.568745 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.571590 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.571667 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.571675 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.571691 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.571701 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.673802 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.673846 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.673857 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.673874 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.673886 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.777100 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.777174 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.777190 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.777712 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.777770 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.881083 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.881159 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.881183 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.881212 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.881233 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.984115 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.984169 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.984185 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.984204 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:29 crc kubenswrapper[4836]: I0217 14:07:29.984218 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:29Z","lastTransitionTime":"2026-02-17T14:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.086872 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.086917 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.086932 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.086949 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.086961 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.189481 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.189533 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.189544 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.189563 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.189574 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.292529 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.292584 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.292597 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.292615 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.292628 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.395171 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.395260 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.395280 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.395323 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.395340 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.497637 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.497704 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.497715 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.497729 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.497739 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.526243 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 12:38:40.528963581 +0000 UTC Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.599664 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.599717 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.599733 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.599754 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.599770 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.702921 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.702990 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.703004 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.703025 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.703040 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.805697 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.805763 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.805784 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.805804 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.805818 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.908206 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.908268 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.908279 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.908311 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:30 crc kubenswrapper[4836]: I0217 14:07:30.908324 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:30Z","lastTransitionTime":"2026-02-17T14:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.011108 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.011155 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.011164 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.011178 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.011190 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.113472 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.113507 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.113518 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.113533 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.113547 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.215829 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.215883 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.215895 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.215917 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.215928 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.318617 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.318650 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.318659 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.318674 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.318683 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.422086 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.422124 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.422135 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.422150 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.422162 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.525872 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.525963 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.525986 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.526016 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.526033 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.526676 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 08:55:16.977807636 +0000 UTC Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.567634 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:31 crc kubenswrapper[4836]: E0217 14:07:31.567767 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.567966 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:31 crc kubenswrapper[4836]: E0217 14:07:31.568066 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.568147 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.568199 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:31 crc kubenswrapper[4836]: E0217 14:07:31.568369 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:31 crc kubenswrapper[4836]: E0217 14:07:31.568651 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.628434 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.628465 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.628473 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.628486 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.628495 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.731393 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.731448 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.731464 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.731487 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.731507 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.836242 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.836288 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.836324 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.836341 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.836353 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.938671 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.938718 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.938727 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.938741 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:31 crc kubenswrapper[4836]: I0217 14:07:31.938750 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:31Z","lastTransitionTime":"2026-02-17T14:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.041733 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.041783 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.041793 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.041814 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.041824 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.144119 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.144164 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.144173 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.144187 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.144196 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.247290 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.247347 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.247368 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.247384 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.247396 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.350992 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.351072 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.351090 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.351113 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.351133 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.453485 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.453528 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.453539 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.453554 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.453564 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.527759 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 20:25:02.760338148 +0000 UTC Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.555347 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.555382 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.555389 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.555403 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.555413 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.567696 4836 scope.go:117] "RemoveContainer" containerID="4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.658007 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.658050 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.658062 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.658080 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.658095 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.760541 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.760572 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.760579 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.760594 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.760603 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.863483 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.863562 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.863584 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.863610 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.863631 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.887375 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.887413 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.887424 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.887441 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.887454 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: E0217 14:07:32.908541 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.912292 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.912339 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.912348 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.912364 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.912374 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: E0217 14:07:32.924900 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.928426 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.928473 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.928484 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.928500 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.928511 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: E0217 14:07:32.939859 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.944104 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.944147 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.944159 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.944175 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.944187 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: E0217 14:07:32.959442 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.964764 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.964818 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.964831 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.964849 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.964861 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:32 crc kubenswrapper[4836]: E0217 14:07:32.978476 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:32Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:32 crc kubenswrapper[4836]: E0217 14:07:32.978636 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.980501 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.980540 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.980550 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.980563 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:32 crc kubenswrapper[4836]: I0217 14:07:32.980572 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:32Z","lastTransitionTime":"2026-02-17T14:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.057718 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/2.log" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.060772 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.061246 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.089362 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9
fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.091496 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.091540 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.091549 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.091563 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.091575 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.112556 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.125821 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.139848 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.155428 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.171103 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b3
0211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.193931 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.193982 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.193993 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.194011 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.194023 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.203496 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 
services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\
\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.225821 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.240056 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.251834 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.264018 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.275684 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.292968 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.296664 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.296697 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.296707 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.296723 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.296735 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.303678 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.315700 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc 
kubenswrapper[4836]: I0217 14:07:33.328315 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.341151 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:24Z\\\",\\\"message\\\":\\\"2026-02-17T14:06:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4\\\\n2026-02-17T14:06:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4 to /host/opt/cni/bin/\\\\n2026-02-17T14:06:39Z [verbose] multus-daemon started\\\\n2026-02-17T14:06:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T14:07:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.354355 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:33Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.399453 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.399498 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.399507 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc 
kubenswrapper[4836]: I0217 14:07:33.399522 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.399532 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.501835 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.501882 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.501906 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.501926 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.501938 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.528233 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 16:02:57.355718812 +0000 UTC Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.567724 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.567842 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:33 crc kubenswrapper[4836]: E0217 14:07:33.567951 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.568208 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:33 crc kubenswrapper[4836]: E0217 14:07:33.568280 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.568542 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:33 crc kubenswrapper[4836]: E0217 14:07:33.568605 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:33 crc kubenswrapper[4836]: E0217 14:07:33.568841 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.604724 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.604778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.604792 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.604808 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.604822 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.706960 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.707017 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.707029 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.707046 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.707058 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.809659 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.809769 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.809786 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.810056 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.810071 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.912013 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.912091 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.912102 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.912117 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:33 crc kubenswrapper[4836]: I0217 14:07:33.912129 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:33Z","lastTransitionTime":"2026-02-17T14:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.015201 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.015271 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.015289 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.015339 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.015365 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.066263 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/3.log" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.067116 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/2.log" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.070121 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" exitCode=1 Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.070187 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.070244 4836 scope.go:117] "RemoveContainer" containerID="4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.071288 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:07:34 crc kubenswrapper[4836]: E0217 14:07:34.071884 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.087719 4836 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67
a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.099635 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.113003 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.117021 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.117044 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.117054 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.117070 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.117081 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.129119 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.142557 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.154423 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.172525 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:33Z\\\",\\\"message\\\":\\\"02 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571037 6902 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571067 6902 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571422 6902 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.572098 6902 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:07:33.572236 6902 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 14:07:33.572310 6902 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:07:33.572319 6902 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:07:33.572320 6902 factory.go:656] Stopping watch factory\\\\nI0217 14:07:33.639691 6902 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 14:07:33.639727 6902 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 14:07:33.639772 6902 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:07:33.639792 6902 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:07:33.639858 6902 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.190014 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d9
2528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.204694 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.218070 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc 
kubenswrapper[4836]: I0217 14:07:34.218991 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.219034 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.219050 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.219074 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.219090 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.238673 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf1
0370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.254345 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:24Z\\\",\\\"message\\\":\\\"2026-02-17T14:06:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4\\\\n2026-02-17T14:06:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4 to /host/opt/cni/bin/\\\\n2026-02-17T14:06:39Z [verbose] multus-daemon started\\\\n2026-02-17T14:06:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T14:07:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.271289 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.293960 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.308767 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.321719 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.321767 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.321778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.321794 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.321805 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.324802 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.337586 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.349122 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.424618 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.424665 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.424677 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.424696 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.424706 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.527059 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.527118 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.527139 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.527161 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.527174 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.529267 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 20:33:10.70405295 +0000 UTC Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.585777 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 
14:07:34.604737 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.617324 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.629768 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.629835 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.629847 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.629893 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.629912 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.633450 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.646252 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.657257 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.679141 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4db5a5e396fbddd65ca0e94f1414b226b324f8bce9dd7c243d40bfbd0d060bb1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:02Z\\\",\\\"message\\\":\\\"\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828170 6489 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 requested-chassis:crc]} 
port_security:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61897e97-c771-4738-8709-09636387cb00}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828220 6489 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-config-operator/machine-config-controller\\\\\\\"}\\\\nI0217 14:07:02.828227 6489 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0217 14:07:02.828241 6489 services_controller.go:360] Finished syncing service machine-config-controller on namespace openshift-machine-config-operator for network=default : 1.693755ms\\\\nI0217 14:07:02.828254 6489 services_control\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:33Z\\\",\\\"message\\\":\\\"02 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571037 6902 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571067 6902 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571422 6902 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.572098 6902 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:07:33.572236 6902 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 14:07:33.572310 6902 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:07:33.572319 6902 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:07:33.572320 6902 factory.go:656] Stopping watch factory\\\\nI0217 14:07:33.639691 6902 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 14:07:33.639727 6902 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 14:07:33.639772 6902 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:07:33.639792 6902 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:07:33.639858 6902 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.691753 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.704248 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.717134 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.729266 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.732235 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.732274 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.732284 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.732324 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.732343 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.740705 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.749666 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.764584 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf79
8778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d
34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.778020 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b
9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.788852 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc 
kubenswrapper[4836]: I0217 14:07:34.802992 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.818007 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:24Z\\\",\\\"message\\\":\\\"2026-02-17T14:06:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4\\\\n2026-02-17T14:06:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4 to /host/opt/cni/bin/\\\\n2026-02-17T14:06:39Z [verbose] multus-daemon started\\\\n2026-02-17T14:06:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T14:07:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:34Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.834863 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.834902 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.834911 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.834924 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.834933 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.937064 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.937101 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.937424 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.937772 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:34 crc kubenswrapper[4836]: I0217 14:07:34.937854 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:34Z","lastTransitionTime":"2026-02-17T14:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.041497 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.041553 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.041571 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.041594 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.041610 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.077772 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/3.log" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.083067 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:07:35 crc kubenswrapper[4836]: E0217 14:07:35.083365 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.103466 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.120557 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.144560 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.144618 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.144631 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.144648 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.144659 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.150666 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.168083 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.184085 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.196594 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.210458 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.225074 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.236521 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.247444 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.247504 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.247517 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.247538 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.247552 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.268863 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:33Z\\\",\\\"message\\\":\\\"02 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571037 6902 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571067 6902 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571422 6902 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.572098 6902 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:07:33.572236 6902 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 14:07:33.572310 6902 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:07:33.572319 6902 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:07:33.572320 6902 factory.go:656] Stopping watch factory\\\\nI0217 14:07:33.639691 6902 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 14:07:33.639727 6902 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 14:07:33.639772 6902 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:07:33.639792 6902 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:07:33.639858 6902 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.284698 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.298852 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.315154 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.332039 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.347383 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.349855 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.349897 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.349908 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.349926 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.349936 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.358279 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc 
kubenswrapper[4836]: I0217 14:07:35.369828 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.381818 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:24Z\\\",\\\"message\\\":\\\"2026-02-17T14:06:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4\\\\n2026-02-17T14:06:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4 to /host/opt/cni/bin/\\\\n2026-02-17T14:06:39Z [verbose] multus-daemon started\\\\n2026-02-17T14:06:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T14:07:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:35Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.452018 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.452345 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.452422 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.452513 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.452583 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.529935 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 17:21:19.640306914 +0000 UTC Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.555437 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.555493 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.555509 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.555530 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.555546 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.568005 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.568110 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:35 crc kubenswrapper[4836]: E0217 14:07:35.568209 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.568233 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:35 crc kubenswrapper[4836]: E0217 14:07:35.568818 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:35 crc kubenswrapper[4836]: E0217 14:07:35.568934 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.568981 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:35 crc kubenswrapper[4836]: E0217 14:07:35.569116 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.658194 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.658239 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.658255 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.658275 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.658337 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.761758 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.761793 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.761801 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.761814 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.761823 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.864207 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.864248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.864259 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.864274 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.864283 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.968665 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.968695 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.968703 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.968715 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:35 crc kubenswrapper[4836]: I0217 14:07:35.968724 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:35Z","lastTransitionTime":"2026-02-17T14:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.071497 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.071643 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.071657 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.071671 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.071678 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.174647 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.174707 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.174721 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.174742 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.174756 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.276765 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.276806 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.276818 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.276832 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.276841 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.379373 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.379423 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.379437 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.379454 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.379466 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.482129 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.482182 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.482195 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.482213 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.482225 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.530312 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 11:03:19.085337094 +0000 UTC Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.585358 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.585844 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.585876 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.585908 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.585932 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.688429 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.688477 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.688488 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.688505 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.688517 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.791238 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.791281 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.791306 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.791320 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.791331 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.894258 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.894331 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.894343 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.894363 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.894375 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.997410 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.997468 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.997484 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.997505 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:36 crc kubenswrapper[4836]: I0217 14:07:36.997522 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:36Z","lastTransitionTime":"2026-02-17T14:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.100457 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.100492 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.100502 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.100514 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.100522 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.202590 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.202622 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.202633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.202649 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.202659 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.305274 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.305335 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.305346 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.305362 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.305376 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.407641 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.407685 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.407696 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.407712 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.407723 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.510240 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.510275 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.510285 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.510313 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.510323 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.530895 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:45:21.917980104 +0000 UTC Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.567764 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.567859 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.567918 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.567891 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:37 crc kubenswrapper[4836]: E0217 14:07:37.568672 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:37 crc kubenswrapper[4836]: E0217 14:07:37.568740 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:37 crc kubenswrapper[4836]: E0217 14:07:37.568816 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:37 crc kubenswrapper[4836]: E0217 14:07:37.568875 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.613657 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.613702 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.613714 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.613734 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.613748 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.716636 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.716690 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.716702 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.716720 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.716731 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.819515 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.819578 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.819591 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.819611 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.819624 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.922364 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.922411 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.922423 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.922439 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:37 crc kubenswrapper[4836]: I0217 14:07:37.922453 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:37Z","lastTransitionTime":"2026-02-17T14:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.024898 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.024959 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.024973 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.024990 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.025006 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.128173 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.129012 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.129070 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.129098 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.129129 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.231716 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.231766 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.231778 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.231795 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.231810 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.334754 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.334826 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.334859 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.334892 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.334913 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.438468 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.438544 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.438560 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.438590 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.438606 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.531228 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 17:02:39.86197183 +0000 UTC Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.540746 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.540785 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.540793 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.540806 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.540817 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.643489 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.643545 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.643558 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.643576 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.643591 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.746552 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.746614 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.746630 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.746656 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.746673 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.848741 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.848790 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.848799 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.848812 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.848822 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.951962 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.952028 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.952055 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.952109 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:38 crc kubenswrapper[4836]: I0217 14:07:38.952135 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:38Z","lastTransitionTime":"2026-02-17T14:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.054968 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.055015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.055028 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.055046 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.055057 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.157665 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.157717 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.157727 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.157742 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.157753 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.260058 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.260093 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.260104 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.260116 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.260127 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.268396 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.268486 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.268518 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.268624 4836 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.268678 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.268643807 +0000 UTC m=+149.611572076 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.268745 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.26873629 +0000 UTC m=+149.611664639 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.268720 4836 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.268960 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.268927605 +0000 UTC m=+149.611855874 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.366149 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.366244 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.366268 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.366331 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.366377 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.368983 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.369054 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369220 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369260 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369280 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369346 4836 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369365 4836 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369308 4836 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369425 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.369403859 +0000 UTC m=+149.712332158 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.369466 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.36944953 +0000 UTC m=+149.712377799 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.469030 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.469083 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.469092 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.469106 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.469115 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.531746 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:28:24.157805793 +0000 UTC Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.566938 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.566972 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.567068 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.567114 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.567384 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.567569 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.567713 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:39 crc kubenswrapper[4836]: E0217 14:07:39.567810 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.572229 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.572262 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.572274 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.572289 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.572321 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.673953 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.673983 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.673992 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.674027 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.674037 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.777277 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.777360 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.777374 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.777394 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.777406 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.879872 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.879908 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.879936 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.879948 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.879958 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.981712 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.981806 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.981823 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.981845 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:39 crc kubenswrapper[4836]: I0217 14:07:39.981862 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:39Z","lastTransitionTime":"2026-02-17T14:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.084447 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.084508 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.084519 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.084536 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.084547 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.187345 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.187409 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.187425 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.187447 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.187464 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.290342 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.290387 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.290397 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.290413 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.290422 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.393018 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.393075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.393088 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.393104 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.393115 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.497413 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.497492 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.497515 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.497544 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.497567 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.532337 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 07:22:45.190093635 +0000 UTC Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.600497 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.600539 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.600548 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.600563 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.600574 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.703908 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.703980 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.703993 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.704011 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.704044 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.806265 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.806333 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.806344 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.806357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.806367 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.908094 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.908131 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.908141 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.908152 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:40 crc kubenswrapper[4836]: I0217 14:07:40.908160 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:40Z","lastTransitionTime":"2026-02-17T14:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.011893 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.011960 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.011978 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.012000 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.012016 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.114641 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.115125 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.115148 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.115174 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.115194 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.217724 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.217785 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.217795 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.217815 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.217827 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.321354 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.321434 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.321450 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.321475 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.321490 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.424258 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.424340 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.424357 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.424381 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.424399 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.527776 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.527827 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.527835 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.527851 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.527860 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.532982 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 14:21:57.448419359 +0000 UTC Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.567623 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:41 crc kubenswrapper[4836]: E0217 14:07:41.567835 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.568557 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.568630 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.568746 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:41 crc kubenswrapper[4836]: E0217 14:07:41.568656 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:41 crc kubenswrapper[4836]: E0217 14:07:41.568854 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:41 crc kubenswrapper[4836]: E0217 14:07:41.568972 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.631009 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.631064 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.631077 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.631097 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.631108 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.734382 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.734462 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.734481 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.734507 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.734526 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.838432 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.838482 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.838494 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.838512 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.838524 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.942561 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.942628 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.942640 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.942706 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:41 crc kubenswrapper[4836]: I0217 14:07:41.942719 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:41Z","lastTransitionTime":"2026-02-17T14:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.046029 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.046086 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.046096 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.046119 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.046130 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.153528 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.154134 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.154411 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.154451 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.154473 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.258723 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.258776 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.258785 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.258806 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.258818 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.361765 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.361813 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.361925 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.361949 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.361964 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.464740 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.464787 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.464799 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.464814 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.464825 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.533288 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 03:58:23.448727432 +0000 UTC Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.567884 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.568005 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.568019 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.568036 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.568047 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.669828 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.669884 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.669893 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.669905 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.669933 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.772030 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.772086 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.772097 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.772115 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.772125 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.875078 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.875128 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.875139 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.875155 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.875511 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.978036 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.978093 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.978108 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.978124 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:42 crc kubenswrapper[4836]: I0217 14:07:42.978385 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:42Z","lastTransitionTime":"2026-02-17T14:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.081918 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.081973 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.081989 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.082008 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.082024 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.184826 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.184897 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.184925 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.184956 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.184978 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.200833 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.200906 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.200930 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.200959 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.200985 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.220221 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.225410 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.225450 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.225492 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.225514 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.225529 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.241163 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.244814 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.244841 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.244849 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.244863 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.244872 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.255685 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.259900 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.259959 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.259971 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.260010 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.260022 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.274172 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.277444 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.277487 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.277500 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.277538 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.277556 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.288149 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d638d470-b0e0-4be9-938f-7ec815bf6bd8\\\",\\\"systemUUID\\\":\\\"f194f106-0bf2-4b65-bcb3-5215631b39d2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:43Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.288332 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.289715 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.289760 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.289770 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.289786 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.289796 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.391555 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.391823 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.391957 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.392072 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.392162 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.495917 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.495965 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.495982 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.496006 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.496024 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.533501 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 17:46:54.640625903 +0000 UTC Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.567361 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.567423 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.567430 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.567380 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.567527 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.567682 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.567757 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:43 crc kubenswrapper[4836]: E0217 14:07:43.567858 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.599022 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.599070 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.599113 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.599134 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.599149 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.702483 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.702555 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.702578 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.702608 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.702631 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.805549 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.805629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.805650 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.805683 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.805704 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.908599 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.908663 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.908680 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.908702 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:43 crc kubenswrapper[4836]: I0217 14:07:43.908720 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:43Z","lastTransitionTime":"2026-02-17T14:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.011612 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.011655 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.011670 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.011690 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.011705 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.117200 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.117280 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.117345 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.117377 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.117411 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.221150 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.221216 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.221250 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.221286 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.221364 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.324650 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.324711 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.324720 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.324738 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.324750 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.429647 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.429688 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.429697 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.429710 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.429719 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.532438 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.532711 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.532765 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.532800 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.532823 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.534634 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 22:54:20.404045616 +0000 UTC Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.585954 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vt5sw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d1f430-35ed-4c4e-a797-d7a0a5a45266\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://25f1e7f83b2916db4faac1afe1f5107375c0e1674ff1d6f90f1025a9920111e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kqtsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vt5sw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.612123 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243121c7-6c63-4df6-a6a6-1ef6770e1125\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0143c73054039d5957c29bdf62217900d803ba2e0046c09dcff149b27fc94995\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://18b1d7908740d818568e489caf3447ab0f5768c180cc6d443adc67414e51cd07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e054e9ec9a1f8d54b704be11ff2e55c9aaad9d717b1817aa553cfbf2cc6ac16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a5d1fd5d2de253417bcb1e4edadc2363a1c259769418eb2b2aab3ff8253de2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fbf3d3daf43d0da96ad0c1340b3c72815adceec24d0ed5fc97f965d1d93e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://854199187b81a3d6692a05073c35985feb14993f76f4ac86149c2b6bac635e4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67b26f0d65128004f7645ac1f54a11bc56909269cb9b9337e447b0d5bf783c4e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9fc4a71369f80c06598341729c7db0cb6e0f33d70b0be64470bc6fd40d1cce2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.634277 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91b942ef1c58993ae6344d4aef5fcfe21434b5f3b3282bdf8070560fa8973a3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77390b24cb4a223b8eb6a46db49a4a88b68f87aa0b4b134b45a552c180dec5b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.635342 4836 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.635374 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.635385 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.635409 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.635433 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.653099 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.676027 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6efd6a58b6d90897fa902c8d0c8082ecec2899f16adc3e56b98f0bc5c4640d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.691677 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.706076 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.719137 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jlz6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f87e91ef-e64c-45a5-9bd5-cc6537e51b1b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e03fa870790f606ebf6db3bac825e59b209b7a2bd6cb22264597b30211fbc2fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbzg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jlz6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.739388 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.739431 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.739440 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.739455 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.739466 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.739518 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:33Z\\\",\\\"message\\\":\\\"02 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571037 6902 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571067 6902 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.571422 6902 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 14:07:33.572098 6902 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 14:07:33.572236 6902 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 14:07:33.572310 6902 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 14:07:33.572319 6902 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 14:07:33.572320 6902 factory.go:656] Stopping watch factory\\\\nI0217 14:07:33.639691 6902 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0217 14:07:33.639727 6902 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0217 14:07:33.639772 6902 ovnkube.go:599] Stopped ovnkube\\\\nI0217 14:07:33.639792 6902 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0217 14:07:33.639858 6902 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:07:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81e6d07edc7d7857cb
288980bc7e9d5002234c2fe642df6798611b150b6d66e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7zdwb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gfznp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.754526 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15e7e5f3-c249-4609-937e-ffdc78580880\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T14:06:35Z\\\"
,\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 14:06:30.193193 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 14:06:30.195418 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-478710080/tls.crt::/tmp/serving-cert-478710080/tls.key\\\\\\\"\\\\nI0217 14:06:35.690345 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 14:06:35.701676 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 14:06:35.701714 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 14:06:35.701739 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 14:06:35.701745 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 14:06:35.709986 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 14:06:35.710011 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710016 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 14:06:35.710020 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 14:06:35.710023 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 14:06:35.710027 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 14:06:35.710030 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0217 14:06:35.710186 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0217 14:06:35.711336 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:19Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf6a761df18c5e7699f2ec87d4b97f19d
a71f588cb12eff8e541140a361fa905\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.769145 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e7e5246-9255-4e1e-95c9-44606b16c1ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://125ff8db8d520f18c812d9fabdb8e31c63958022c790a31167fabc3549de5f7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd23a5cef9ad5b95a15765a1cc2055e05de0f9821f426745c1e50829ed1c8141\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1c943885038503c4b93bc7688fb4ac415496280e3f969326883c0d366c66e08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://73bf4b735f1125a8bb5b1f3f449b651c60fdc4ee62b918e28eb3b13278e1f7d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.784878 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc30b1c397bb81c58abc71fb2c6d71c94b9a0f7962d8278edc4f65f1a1c64ef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.803432 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t7845" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eeaa6bd-bab3-4310-9522-747924f2e825\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://54ef47d3331023c2e4524dc2c8958b4ed4e2a499eb4d5f084f70cd4afcca1d32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6778ba6ea7cfd1d378cae6dc4330542c35cedaff8e820a2158571f542e0ec55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14eab60dcf798778aa77f01af0307c7a5ef44846ceaf5a12e4a5f529b812aaa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd65cf675b0d34fc5ea309bdcd17ef648bbf1c0ee6e2ece25f459d8d2403387c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b15d92528d05fd8bcab26df5eeb85616b44e750574cc8effa1f34c39dc329bf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ddc9616f0a81f7623eb10b0cdb194656ad856548f814df2e6cebdb2419293f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a96c0c25b8539ae8bb660010bbcd7d147a150bfb653dc725a34bc7e4e4b949b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T14:06:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-grf7r\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t7845\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.815172 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b98ed9fc-ca9f-49c8-b92d-c5b58df2ce3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac6f7d7722015b1a0d40a119c1c74e8a37a7e96e735f56a93ee2fde68d0e10f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61f8fdf114b2f1ca00e55833ea4eeb033f97b9fa48a3308c4db3c7dbc617c3b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j6dh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:48Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7nmc8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.828523 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-c4txt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-g78bt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-c4txt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc 
kubenswrapper[4836]: I0217 14:07:44.842056 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a443a5a-83a8-4154-94b7-ace6b324f461\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8787a45a4a94da65a0f67f1a402bfa526fa57631259981ded7ec3569fa977f36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21927a2142504e88fd8bb2a254d804c0e13f6a42e81e16c02205292de3de3155\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c21e5f00c6fb44be82d2a907c0f9d17ea47071615370135f0e94a62f1eb7840\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.842164 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.842193 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.842203 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.842218 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.842228 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.856716 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-c76cc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"592aa549-1b1b-441e-93e4-0821e05ff2b2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d563
29a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T14:07:24Z\\\",\\\"message\\\":\\\"2026-02-17T14:06:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4\\\\n2026-02-17T14:06:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_500ea1db-a57d-4559-adb6-4997611db3f4 to /host/opt/cni/bin/\\\\n2026-02-17T14:06:39Z [verbose] multus-daemon started\\\\n2026-02-17T14:06:39Z [verbose] Readiness Indicator file check\\\\n2026-02-17T14:07:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8vh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-c76cc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.868090 4836 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"895a19c9-a3f0-4a15-aa19-19347121388c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T14:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bce78b27c441ccb16f0fd2489cbe3e38be0f3d330fcd2b602f62a8ca0aa8616b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e7
88b5f2787c522a52d28f58cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T14:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-99tf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T14:06:35Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bkk9g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T14:07:44Z is after 2025-08-24T17:21:41Z" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.943843 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.943878 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.943889 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:44 crc 
kubenswrapper[4836]: I0217 14:07:44.943906 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:44 crc kubenswrapper[4836]: I0217 14:07:44.943917 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:44Z","lastTransitionTime":"2026-02-17T14:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.046166 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.046198 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.046206 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.046220 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.046229 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.149148 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.149193 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.149206 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.149225 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.149238 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.252525 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.252695 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.252718 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.252754 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.252791 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.355581 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.355640 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.355651 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.355668 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.355680 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.457543 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.457597 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.457615 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.457638 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.457654 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.535692 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 09:46:32.808339694 +0000 UTC Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.559904 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.559957 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.559972 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.559994 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.560007 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.567310 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.567542 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.567590 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.567560 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:45 crc kubenswrapper[4836]: E0217 14:07:45.567677 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:45 crc kubenswrapper[4836]: E0217 14:07:45.567768 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:45 crc kubenswrapper[4836]: E0217 14:07:45.567844 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:45 crc kubenswrapper[4836]: E0217 14:07:45.567921 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.662712 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.662751 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.662762 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.662776 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.662787 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.765389 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.765446 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.765461 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.765480 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.765497 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.868312 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.868352 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.868366 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.868385 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.868398 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.970644 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.970700 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.970718 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.970738 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:45 crc kubenswrapper[4836]: I0217 14:07:45.970753 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:45Z","lastTransitionTime":"2026-02-17T14:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.073073 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.073145 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.073172 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.073200 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.073220 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.175934 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.176178 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.176410 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.176516 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.176702 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.279021 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.279058 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.279071 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.279094 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.279108 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.381946 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.381990 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.382001 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.382017 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.382030 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.484471 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.484539 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.484549 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.484562 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.484570 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.536134 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 09:50:22.6035668 +0000 UTC Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.568394 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:07:46 crc kubenswrapper[4836]: E0217 14:07:46.568570 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.578543 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.586466 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.586486 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.586494 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.586505 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.586513 4836 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.688714 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.689003 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.689084 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.689157 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.689225 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.791567 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.791683 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.791726 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.791739 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.791750 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.894873 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.894916 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.894928 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.894941 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.894949 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.999120 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.999181 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.999205 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.999236 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:46 crc kubenswrapper[4836]: I0217 14:07:46.999255 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:46Z","lastTransitionTime":"2026-02-17T14:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.102477 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.102529 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.102540 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.102558 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.102570 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.205384 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.205421 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.205433 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.205449 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.205460 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.308163 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.308203 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.308216 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.308232 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.308242 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.415248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.415463 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.415489 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.415520 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.415538 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.518361 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.518450 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.518466 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.518491 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.518510 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.536778 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:58:35.586604921 +0000 UTC Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.567377 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.567692 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.567378 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:47 crc kubenswrapper[4836]: E0217 14:07:47.567889 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:47 crc kubenswrapper[4836]: E0217 14:07:47.567761 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:47 crc kubenswrapper[4836]: E0217 14:07:47.568044 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.567377 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:47 crc kubenswrapper[4836]: E0217 14:07:47.568474 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.620652 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.620943 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.621044 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.621143 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.621443 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.724398 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.724447 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.724459 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.724477 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.724488 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.827264 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.827328 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.827343 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.827358 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.827368 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.930422 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.930465 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.930477 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.930493 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:47 crc kubenswrapper[4836]: I0217 14:07:47.930521 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:47Z","lastTransitionTime":"2026-02-17T14:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.033114 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.033156 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.033164 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.033180 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.033191 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.136600 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.136674 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.136693 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.136716 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.136733 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.240036 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.240161 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.240187 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.240217 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.240240 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.342744 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.342793 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.342805 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.342822 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.342835 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.445123 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.445169 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.445181 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.445199 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.445211 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.537277 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 06:49:34.849817307 +0000 UTC Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.547878 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.547936 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.547958 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.547988 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.548014 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.651502 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.651579 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.651597 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.651624 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.651642 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.753680 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.753720 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.753729 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.753742 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.753751 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.855990 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.856027 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.856050 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.856079 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.856103 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.957891 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.957930 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.957944 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.957960 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:48 crc kubenswrapper[4836]: I0217 14:07:48.958003 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:48Z","lastTransitionTime":"2026-02-17T14:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.060922 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.060994 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.061020 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.061052 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.061076 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.163132 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.163163 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.163171 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.163184 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.163195 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.265720 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.265774 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.265786 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.265808 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.265820 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.369075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.369136 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.369154 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.369176 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.369193 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.471580 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.471649 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.471671 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.471696 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.471716 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.538518 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 05:35:54.929073955 +0000 UTC Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.568043 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.568089 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.568080 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.568144 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:49 crc kubenswrapper[4836]: E0217 14:07:49.568276 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:49 crc kubenswrapper[4836]: E0217 14:07:49.568487 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:49 crc kubenswrapper[4836]: E0217 14:07:49.568619 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:49 crc kubenswrapper[4836]: E0217 14:07:49.568930 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.574563 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.574605 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.574615 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.574630 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.574639 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.676462 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.676496 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.676504 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.676517 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.676533 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.779178 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.779262 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.779285 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.779368 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.779393 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.882350 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.882418 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.882437 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.882465 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.882485 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.984862 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.984894 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.984902 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.984915 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:49 crc kubenswrapper[4836]: I0217 14:07:49.984944 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:49Z","lastTransitionTime":"2026-02-17T14:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.088532 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.088595 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.088608 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.088629 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.088643 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.191250 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.191309 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.191328 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.191346 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.191357 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.293969 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.294005 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.294014 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.294027 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.294035 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.395885 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.395922 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.395931 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.395946 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.395955 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.498976 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.499001 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.499008 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.499021 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.499030 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.539392 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:52:46.854796929 +0000 UTC Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.602187 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.602242 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.602254 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.602269 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.602281 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.705398 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.705445 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.705455 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.705469 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.705478 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.808173 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.808250 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.808278 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.808356 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.808385 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.912529 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.912609 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.912628 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.912657 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:50 crc kubenswrapper[4836]: I0217 14:07:50.912676 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:50Z","lastTransitionTime":"2026-02-17T14:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.015864 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.016124 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.016249 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.016350 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.016419 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.119164 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.119233 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.119246 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.119265 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.119283 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.222151 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.222212 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.222231 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.222251 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.222262 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.324900 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.325248 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.325371 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.325466 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.325555 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.428593 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.428633 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.428644 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.428660 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.428671 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.531800 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.532194 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.532463 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.532704 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.532890 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.540336 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 12:58:12.314732636 +0000 UTC Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.566948 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.566984 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.566954 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:51 crc kubenswrapper[4836]: E0217 14:07:51.567072 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.566948 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:51 crc kubenswrapper[4836]: E0217 14:07:51.567186 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:51 crc kubenswrapper[4836]: E0217 14:07:51.567241 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:51 crc kubenswrapper[4836]: E0217 14:07:51.567281 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.635823 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.635874 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.635890 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.635909 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.635923 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.738402 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.738462 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.738485 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.738511 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.738529 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.842015 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.842062 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.842075 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.842096 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.842112 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.944563 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.944601 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.944610 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.944626 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:51 crc kubenswrapper[4836]: I0217 14:07:51.944636 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:51Z","lastTransitionTime":"2026-02-17T14:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.047762 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.047834 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.047853 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.047886 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.047924 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.149864 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.149911 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.149923 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.149942 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.149954 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.253002 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.253047 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.253059 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.253076 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.253094 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.355834 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.355897 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.355908 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.355921 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.355930 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.459025 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.459063 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.459073 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.459088 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.459099 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.541230 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 01:36:28.824234902 +0000 UTC Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.561079 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.561170 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.561188 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.561209 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.561224 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.663078 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.663120 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.663135 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.663150 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.663159 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.766564 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.766632 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.766657 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.766686 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.766703 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.869706 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.869807 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.869825 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.869849 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.869866 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.972706 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.972767 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.972776 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.972794 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:52 crc kubenswrapper[4836]: I0217 14:07:52.972803 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:52Z","lastTransitionTime":"2026-02-17T14:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.075729 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.075801 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.075818 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.075840 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.075860 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:53Z","lastTransitionTime":"2026-02-17T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.178972 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.179026 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.179039 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.179057 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.179070 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:53Z","lastTransitionTime":"2026-02-17T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.281897 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.281943 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.281954 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.281972 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.281984 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:53Z","lastTransitionTime":"2026-02-17T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.384219 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.384274 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.384290 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.384338 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.384351 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:53Z","lastTransitionTime":"2026-02-17T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.487085 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.487112 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.487120 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.487132 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.487142 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:53Z","lastTransitionTime":"2026-02-17T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.542735 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 14:21:19.455757075 +0000 UTC Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.567464 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.567482 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.567535 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.567615 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:53 crc kubenswrapper[4836]: E0217 14:07:53.567621 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:53 crc kubenswrapper[4836]: E0217 14:07:53.567690 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:53 crc kubenswrapper[4836]: E0217 14:07:53.567762 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:53 crc kubenswrapper[4836]: E0217 14:07:53.567822 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.589255 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.589289 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.589315 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.589328 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.589337 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:53Z","lastTransitionTime":"2026-02-17T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.684903 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.684942 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.684952 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.684967 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.684978 4836 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T14:07:53Z","lastTransitionTime":"2026-02-17T14:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.729414 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns"] Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.730010 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.732757 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.733017 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.733955 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.740551 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.752334 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.752366 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.752386 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.752403 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.752545 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.761882 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=78.761844054 podStartE2EDuration="1m18.761844054s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:53.760031944 +0000 UTC m=+100.102960263" watchObservedRunningTime="2026-02-17 14:07:53.761844054 +0000 UTC m=+100.104772333" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.848514 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vt5sw" podStartSLOduration=78.848493769 podStartE2EDuration="1m18.848493769s" podCreationTimestamp="2026-02-17 14:06:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:53.848222091 +0000 UTC m=+100.191150370" watchObservedRunningTime="2026-02-17 14:07:53.848493769 +0000 UTC m=+100.191422038" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.853660 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.853806 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.853836 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.853861 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.853883 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.854521 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.854554 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.855014 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.861467 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.868803 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.868778581 podStartE2EDuration="7.868778581s" podCreationTimestamp="2026-02-17 14:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:53.868416951 +0000 UTC m=+100.211345230" watchObservedRunningTime="2026-02-17 14:07:53.868778581 +0000 UTC m=+100.211706850" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.872682 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/053cbec1-9d4f-42bc-8df5-28eb6c95a6c0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ff9ns\" (UID: \"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.902055 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.90203656 podStartE2EDuration="1m17.90203656s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:53.884843111 +0000 UTC m=+100.227771400" watchObservedRunningTime="2026-02-17 14:07:53.90203656 +0000 UTC m=+100.244964829" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.917963 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.917944395 podStartE2EDuration="44.917944395s" podCreationTimestamp="2026-02-17 14:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:53.903444617 +0000 UTC m=+100.246372936" watchObservedRunningTime="2026-02-17 14:07:53.917944395 +0000 UTC m=+100.260872664" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.955253 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:53 crc kubenswrapper[4836]: E0217 14:07:53.955510 4836 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:07:53 crc kubenswrapper[4836]: E0217 14:07:53.955615 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs podName:8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c nodeName:}" failed. No retries permitted until 2026-02-17 14:08:57.955594021 +0000 UTC m=+164.298522340 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs") pod "network-metrics-daemon-c4txt" (UID: "8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.957805 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jlz6g" podStartSLOduration=78.95779353 podStartE2EDuration="1m18.95779353s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:53.95708283 +0000 UTC m=+100.300011119" watchObservedRunningTime="2026-02-17 14:07:53.95779353 +0000 UTC m=+100.300721799" Feb 17 14:07:53 crc kubenswrapper[4836]: I0217 14:07:53.998785 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t7845" podStartSLOduration=78.998768145 podStartE2EDuration="1m18.998768145s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:53.998157548 +0000 UTC m=+100.341085817" watchObservedRunningTime="2026-02-17 14:07:53.998768145 +0000 UTC m=+100.341696414" Feb 17 14:07:54 crc kubenswrapper[4836]: I0217 14:07:54.010659 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7nmc8" podStartSLOduration=78.010636731 podStartE2EDuration="1m18.010636731s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:54.009675736 +0000 UTC m=+100.352604025" watchObservedRunningTime="2026-02-17 
14:07:54.010636731 +0000 UTC m=+100.353565000" Feb 17 14:07:54 crc kubenswrapper[4836]: I0217 14:07:54.055823 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" Feb 17 14:07:54 crc kubenswrapper[4836]: I0217 14:07:54.060228 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=75.060212827 podStartE2EDuration="1m15.060212827s" podCreationTimestamp="2026-02-17 14:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:54.05959087 +0000 UTC m=+100.402519169" watchObservedRunningTime="2026-02-17 14:07:54.060212827 +0000 UTC m=+100.403141096" Feb 17 14:07:54 crc kubenswrapper[4836]: W0217 14:07:54.084614 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053cbec1_9d4f_42bc_8df5_28eb6c95a6c0.slice/crio-4e76d16e65fe5e76ea24a33a9de21232e9b6592452d31dbe3c10a1103008ab47 WatchSource:0}: Error finding container 4e76d16e65fe5e76ea24a33a9de21232e9b6592452d31dbe3c10a1103008ab47: Status 404 returned error can't find the container with id 4e76d16e65fe5e76ea24a33a9de21232e9b6592452d31dbe3c10a1103008ab47 Feb 17 14:07:54 crc kubenswrapper[4836]: I0217 14:07:54.085640 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-c76cc" podStartSLOduration=79.085616956 podStartE2EDuration="1m19.085616956s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:54.085134843 +0000 UTC m=+100.428063142" watchObservedRunningTime="2026-02-17 14:07:54.085616956 +0000 UTC m=+100.428545235" Feb 17 14:07:54 crc kubenswrapper[4836]: 
I0217 14:07:54.141124 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" event={"ID":"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0","Type":"ContainerStarted","Data":"4e76d16e65fe5e76ea24a33a9de21232e9b6592452d31dbe3c10a1103008ab47"} Feb 17 14:07:54 crc kubenswrapper[4836]: I0217 14:07:54.543940 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:23:40.301507524 +0000 UTC Feb 17 14:07:54 crc kubenswrapper[4836]: I0217 14:07:54.544022 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 14:07:54 crc kubenswrapper[4836]: I0217 14:07:54.563861 4836 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 14:07:55 crc kubenswrapper[4836]: I0217 14:07:55.145553 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" event={"ID":"053cbec1-9d4f-42bc-8df5-28eb6c95a6c0","Type":"ContainerStarted","Data":"f036d2cecfdd209cf7b301a4b376a4d50f7ea7c9e4e6c1c5524b67376b2b1226"} Feb 17 14:07:55 crc kubenswrapper[4836]: I0217 14:07:55.158756 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff9ns" podStartSLOduration=80.158717992 podStartE2EDuration="1m20.158717992s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:55.158052395 +0000 UTC m=+101.500980674" watchObservedRunningTime="2026-02-17 14:07:55.158717992 +0000 UTC m=+101.501646251" Feb 17 14:07:55 crc kubenswrapper[4836]: I0217 14:07:55.159080 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podStartSLOduration=80.159074292 podStartE2EDuration="1m20.159074292s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:07:54.097618996 +0000 UTC m=+100.440547275" watchObservedRunningTime="2026-02-17 14:07:55.159074292 +0000 UTC m=+101.502002561" Feb 17 14:07:55 crc kubenswrapper[4836]: I0217 14:07:55.567956 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:55 crc kubenswrapper[4836]: I0217 14:07:55.567958 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:55 crc kubenswrapper[4836]: E0217 14:07:55.568698 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:55 crc kubenswrapper[4836]: I0217 14:07:55.568006 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:55 crc kubenswrapper[4836]: E0217 14:07:55.569080 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:55 crc kubenswrapper[4836]: I0217 14:07:55.567998 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:55 crc kubenswrapper[4836]: E0217 14:07:55.568805 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:55 crc kubenswrapper[4836]: E0217 14:07:55.569622 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:57 crc kubenswrapper[4836]: I0217 14:07:57.566947 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:57 crc kubenswrapper[4836]: I0217 14:07:57.566997 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:57 crc kubenswrapper[4836]: E0217 14:07:57.567093 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:07:57 crc kubenswrapper[4836]: I0217 14:07:57.567109 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:57 crc kubenswrapper[4836]: I0217 14:07:57.567661 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:57 crc kubenswrapper[4836]: E0217 14:07:57.567746 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:57 crc kubenswrapper[4836]: E0217 14:07:57.567747 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:57 crc kubenswrapper[4836]: E0217 14:07:57.567934 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:57 crc kubenswrapper[4836]: I0217 14:07:57.568078 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:07:57 crc kubenswrapper[4836]: E0217 14:07:57.568372 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:07:59 crc kubenswrapper[4836]: I0217 14:07:59.567702 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:07:59 crc kubenswrapper[4836]: I0217 14:07:59.567805 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:07:59 crc kubenswrapper[4836]: I0217 14:07:59.567710 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:07:59 crc kubenswrapper[4836]: E0217 14:07:59.567840 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:07:59 crc kubenswrapper[4836]: I0217 14:07:59.567731 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:07:59 crc kubenswrapper[4836]: E0217 14:07:59.568073 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:07:59 crc kubenswrapper[4836]: E0217 14:07:59.568170 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:07:59 crc kubenswrapper[4836]: E0217 14:07:59.568218 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:01 crc kubenswrapper[4836]: I0217 14:08:01.567224 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:01 crc kubenswrapper[4836]: I0217 14:08:01.567241 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:01 crc kubenswrapper[4836]: I0217 14:08:01.567404 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:01 crc kubenswrapper[4836]: E0217 14:08:01.567409 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:01 crc kubenswrapper[4836]: I0217 14:08:01.567468 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:01 crc kubenswrapper[4836]: E0217 14:08:01.567601 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:01 crc kubenswrapper[4836]: E0217 14:08:01.567779 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:01 crc kubenswrapper[4836]: E0217 14:08:01.567842 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:03 crc kubenswrapper[4836]: I0217 14:08:03.567748 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:03 crc kubenswrapper[4836]: I0217 14:08:03.567796 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:03 crc kubenswrapper[4836]: I0217 14:08:03.567736 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:03 crc kubenswrapper[4836]: I0217 14:08:03.567844 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:03 crc kubenswrapper[4836]: E0217 14:08:03.567903 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:03 crc kubenswrapper[4836]: E0217 14:08:03.568035 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:03 crc kubenswrapper[4836]: E0217 14:08:03.568147 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:03 crc kubenswrapper[4836]: E0217 14:08:03.568329 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:05 crc kubenswrapper[4836]: I0217 14:08:05.568482 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:05 crc kubenswrapper[4836]: E0217 14:08:05.568665 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:05 crc kubenswrapper[4836]: I0217 14:08:05.568958 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:05 crc kubenswrapper[4836]: E0217 14:08:05.569056 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:05 crc kubenswrapper[4836]: I0217 14:08:05.569290 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:05 crc kubenswrapper[4836]: E0217 14:08:05.569428 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:05 crc kubenswrapper[4836]: I0217 14:08:05.569692 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:05 crc kubenswrapper[4836]: E0217 14:08:05.569790 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:07 crc kubenswrapper[4836]: I0217 14:08:07.567409 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:07 crc kubenswrapper[4836]: I0217 14:08:07.567453 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:07 crc kubenswrapper[4836]: I0217 14:08:07.567498 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:07 crc kubenswrapper[4836]: E0217 14:08:07.567551 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:07 crc kubenswrapper[4836]: E0217 14:08:07.567633 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:07 crc kubenswrapper[4836]: E0217 14:08:07.567766 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:07 crc kubenswrapper[4836]: I0217 14:08:07.567912 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:07 crc kubenswrapper[4836]: E0217 14:08:07.567987 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:09 crc kubenswrapper[4836]: I0217 14:08:09.567317 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:09 crc kubenswrapper[4836]: I0217 14:08:09.567319 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:09 crc kubenswrapper[4836]: E0217 14:08:09.567456 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:09 crc kubenswrapper[4836]: I0217 14:08:09.567536 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:09 crc kubenswrapper[4836]: I0217 14:08:09.567536 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:09 crc kubenswrapper[4836]: E0217 14:08:09.567662 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:09 crc kubenswrapper[4836]: E0217 14:08:09.567763 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:09 crc kubenswrapper[4836]: E0217 14:08:09.567825 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.202919 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/1.log" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.203740 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/0.log" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.203794 4836 generic.go:334] "Generic (PLEG): container finished" podID="592aa549-1b1b-441e-93e4-0821e05ff2b2" containerID="b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41" exitCode=1 Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.203853 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerDied","Data":"b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41"} Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.203902 4836 scope.go:117] "RemoveContainer" containerID="d56329a484e41f44539eb8ca01b461cc6f79eebf756350b0f49c47c0431e8abc" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.205132 4836 scope.go:117] "RemoveContainer" containerID="b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41" Feb 17 14:08:11 crc 
kubenswrapper[4836]: E0217 14:08:11.205749 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-c76cc_openshift-multus(592aa549-1b1b-441e-93e4-0821e05ff2b2)\"" pod="openshift-multus/multus-c76cc" podUID="592aa549-1b1b-441e-93e4-0821e05ff2b2" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.567247 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.567402 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.567442 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.567471 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:11 crc kubenswrapper[4836]: E0217 14:08:11.568057 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:11 crc kubenswrapper[4836]: E0217 14:08:11.568241 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:11 crc kubenswrapper[4836]: E0217 14:08:11.568610 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:11 crc kubenswrapper[4836]: E0217 14:08:11.568751 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:11 crc kubenswrapper[4836]: I0217 14:08:11.568915 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:08:11 crc kubenswrapper[4836]: E0217 14:08:11.569079 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gfznp_openshift-ovn-kubernetes(67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" Feb 17 14:08:12 crc kubenswrapper[4836]: I0217 14:08:12.208899 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/1.log" Feb 17 14:08:13 crc kubenswrapper[4836]: I0217 
14:08:13.567912 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:13 crc kubenswrapper[4836]: I0217 14:08:13.567966 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:13 crc kubenswrapper[4836]: I0217 14:08:13.568096 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:13 crc kubenswrapper[4836]: E0217 14:08:13.568152 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:13 crc kubenswrapper[4836]: I0217 14:08:13.568183 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:13 crc kubenswrapper[4836]: E0217 14:08:13.568359 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:13 crc kubenswrapper[4836]: E0217 14:08:13.568473 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:13 crc kubenswrapper[4836]: E0217 14:08:13.568562 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:14 crc kubenswrapper[4836]: E0217 14:08:14.495950 4836 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 14:08:14 crc kubenswrapper[4836]: E0217 14:08:14.675999 4836 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:08:15 crc kubenswrapper[4836]: I0217 14:08:15.567751 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:15 crc kubenswrapper[4836]: I0217 14:08:15.567750 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:15 crc kubenswrapper[4836]: I0217 14:08:15.567795 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:15 crc kubenswrapper[4836]: I0217 14:08:15.568528 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:15 crc kubenswrapper[4836]: E0217 14:08:15.568742 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:15 crc kubenswrapper[4836]: E0217 14:08:15.568819 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:15 crc kubenswrapper[4836]: E0217 14:08:15.568964 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:15 crc kubenswrapper[4836]: E0217 14:08:15.569158 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:17 crc kubenswrapper[4836]: I0217 14:08:17.568100 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:17 crc kubenswrapper[4836]: I0217 14:08:17.568128 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:17 crc kubenswrapper[4836]: I0217 14:08:17.568104 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:17 crc kubenswrapper[4836]: E0217 14:08:17.568256 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:17 crc kubenswrapper[4836]: I0217 14:08:17.568358 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:17 crc kubenswrapper[4836]: E0217 14:08:17.568466 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:17 crc kubenswrapper[4836]: E0217 14:08:17.568611 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:17 crc kubenswrapper[4836]: E0217 14:08:17.568712 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:19 crc kubenswrapper[4836]: I0217 14:08:19.567853 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:19 crc kubenswrapper[4836]: I0217 14:08:19.567862 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:19 crc kubenswrapper[4836]: I0217 14:08:19.567888 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:19 crc kubenswrapper[4836]: I0217 14:08:19.567894 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:19 crc kubenswrapper[4836]: E0217 14:08:19.568691 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:19 crc kubenswrapper[4836]: E0217 14:08:19.568774 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:19 crc kubenswrapper[4836]: E0217 14:08:19.568871 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:19 crc kubenswrapper[4836]: E0217 14:08:19.568928 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:19 crc kubenswrapper[4836]: E0217 14:08:19.677194 4836 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:08:21 crc kubenswrapper[4836]: I0217 14:08:21.567355 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:21 crc kubenswrapper[4836]: I0217 14:08:21.567420 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:21 crc kubenswrapper[4836]: I0217 14:08:21.567460 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:21 crc kubenswrapper[4836]: I0217 14:08:21.567443 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:21 crc kubenswrapper[4836]: E0217 14:08:21.567500 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:21 crc kubenswrapper[4836]: E0217 14:08:21.567583 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:21 crc kubenswrapper[4836]: E0217 14:08:21.567649 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:21 crc kubenswrapper[4836]: E0217 14:08:21.567780 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:22 crc kubenswrapper[4836]: I0217 14:08:22.568526 4836 scope.go:117] "RemoveContainer" containerID="b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41" Feb 17 14:08:23 crc kubenswrapper[4836]: I0217 14:08:23.245253 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/1.log" Feb 17 14:08:23 crc kubenswrapper[4836]: I0217 14:08:23.245586 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerStarted","Data":"d7051348fa11415bbd3ca42ccce04342cfc29fef1e5015e7fedf40514e49824c"} Feb 17 14:08:23 crc kubenswrapper[4836]: I0217 14:08:23.567183 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:23 crc kubenswrapper[4836]: I0217 14:08:23.567223 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:23 crc kubenswrapper[4836]: I0217 14:08:23.567191 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:23 crc kubenswrapper[4836]: E0217 14:08:23.567363 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:23 crc kubenswrapper[4836]: I0217 14:08:23.567327 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:23 crc kubenswrapper[4836]: E0217 14:08:23.567437 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:23 crc kubenswrapper[4836]: E0217 14:08:23.567589 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:23 crc kubenswrapper[4836]: E0217 14:08:23.567840 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:24 crc kubenswrapper[4836]: E0217 14:08:24.677731 4836 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:08:25 crc kubenswrapper[4836]: I0217 14:08:25.568024 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:25 crc kubenswrapper[4836]: I0217 14:08:25.568062 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:25 crc kubenswrapper[4836]: I0217 14:08:25.568020 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:25 crc kubenswrapper[4836]: E0217 14:08:25.568225 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:25 crc kubenswrapper[4836]: E0217 14:08:25.568432 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:25 crc kubenswrapper[4836]: E0217 14:08:25.568559 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:25 crc kubenswrapper[4836]: I0217 14:08:25.568671 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:25 crc kubenswrapper[4836]: E0217 14:08:25.568771 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:26 crc kubenswrapper[4836]: I0217 14:08:26.568369 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.261372 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/3.log" Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.263710 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerStarted","Data":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.264190 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.293578 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podStartSLOduration=112.293540766 podStartE2EDuration="1m52.293540766s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:27.290702281 +0000 UTC m=+133.633630590" watchObservedRunningTime="2026-02-17 14:08:27.293540766 +0000 UTC m=+133.636469055" Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.464261 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c4txt"] Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.464457 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:27 crc kubenswrapper[4836]: E0217 14:08:27.464620 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.567238 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.567334 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:27 crc kubenswrapper[4836]: I0217 14:08:27.567238 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:27 crc kubenswrapper[4836]: E0217 14:08:27.567417 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:27 crc kubenswrapper[4836]: E0217 14:08:27.567526 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:27 crc kubenswrapper[4836]: E0217 14:08:27.567587 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:29 crc kubenswrapper[4836]: I0217 14:08:29.567936 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:29 crc kubenswrapper[4836]: I0217 14:08:29.568033 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:29 crc kubenswrapper[4836]: I0217 14:08:29.568045 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:29 crc kubenswrapper[4836]: E0217 14:08:29.568622 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c4txt" podUID="8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c" Feb 17 14:08:29 crc kubenswrapper[4836]: E0217 14:08:29.568445 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 14:08:29 crc kubenswrapper[4836]: I0217 14:08:29.568108 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:29 crc kubenswrapper[4836]: E0217 14:08:29.568796 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 14:08:29 crc kubenswrapper[4836]: E0217 14:08:29.568810 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.567511 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.567564 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.567536 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.567625 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.570676 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.570831 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.570858 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.571311 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.571507 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 14:08:31 crc kubenswrapper[4836]: I0217 14:08:31.572488 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.271098 4836 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.343104 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cnq25"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.344063 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jjmwc"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.344670 4836 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.344780 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.345152 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5l6x4"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.346280 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.346730 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.347192 4836 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.347406 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.347573 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.348710 4836 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.348764 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.348935 4836 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.348971 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.348989 4836 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.349026 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.349040 4836 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.349071 4836 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.349074 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.348943 4836 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.349104 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.349120 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.349131 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz"]
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.349179 4836 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.349194 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.349234 4836 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.349248 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.349651 4836 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.349691 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.350342 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.351216 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.351698 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.352613 4836 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.352652 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.352716 4836 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.352732 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353092 4836 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.353123 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353101 4836 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.353178 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353177 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353192 4836 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353139 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353271 4836 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353289 4836 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353317 4836 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353208 4836 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353322 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353330 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353341 4836 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353365 4836 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.353363 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353288 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353340 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353372 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353377 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353338 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353421 4836 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353451 4836 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353470 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.353481 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353477 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353544 4836 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353590 4836 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353602 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353612 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.353615 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.353814 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 14:08:34 crc kubenswrapper[4836]: W0217 14:08:34.353865 4836 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Feb 17 14:08:34 crc kubenswrapper[4836]: E0217 14:08:34.353885 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.353943 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.355177 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.355762 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.356958 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.357352 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.357639 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.357794 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.359117 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.359262 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.359261 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.359524 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.359382 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.359397 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.362397 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kcm8s"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.363190 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kcm8s"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.365768 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.366059 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.366258 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.366379 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.371003 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.378741 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5cbbv"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.379270 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5cbbv"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.380512 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-98frx"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.390109 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.391837 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.392146 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.392406 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.460698 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.461902 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.462591 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.463412 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.463787 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.464984 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6rsds"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.466245 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.466863 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.479998 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7klmp"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.481459 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.482097 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.490541 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.492237 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.492869 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8sd2q"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.493379 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.493802 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.494262 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.497028 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.502472 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.502893 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2"]
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.503390 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.510977 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.511390 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.511446 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.512052 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.512345 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513128 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513168 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7qrb\" (UniqueName: \"kubernetes.io/projected/66402e53-3287-45c4-bceb-78fc99836c5b-kube-api-access-q7qrb\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513195 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513218 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2840702b-d22f-4184-bada-4cd337d79407-config\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513308 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2840702b-d22f-4184-bada-4cd337d79407-trusted-ca\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513338 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-client\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513560 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513586 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfgf7\" (UniqueName: \"kubernetes.io/projected/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-kube-api-access-rfgf7\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513611 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66402e53-3287-45c4-bceb-78fc99836c5b-node-pullsecrets\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513649 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513674 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513718 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66402e53-3287-45c4-bceb-78fc99836c5b-audit-dir\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513741 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhngg\" (UniqueName: \"kubernetes.io/projected/5ad14aa6-962d-4f8f-babe-745f65d63560-kube-api-access-fhngg\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513766 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513791 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513844 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/444e52ba-f376-40d9-b32f-aa5b523e4134-auth-proxy-config\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513876 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-client-ca\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513903 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dptq7\" (UniqueName: \"kubernetes.io/projected/444e52ba-f376-40d9-b32f-aa5b523e4134-kube-api-access-dptq7\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz"
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513928 4836
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-audit\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513954 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrncr\" (UniqueName: \"kubernetes.io/projected/95872171-94c1-4b8a-935f-ae180a4e3d11-kube-api-access-wrncr\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.513980 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-service-ca-bundle\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514002 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514036 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjcv7\" (UniqueName: \"kubernetes.io/projected/d9eb5c8b-f3c7-4068-82c7-28520f6905c6-kube-api-access-gjcv7\") pod 
\"downloads-7954f5f757-5cbbv\" (UID: \"d9eb5c8b-f3c7-4068-82c7-28520f6905c6\") " pod="openshift-console/downloads-7954f5f757-5cbbv" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514077 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95872171-94c1-4b8a-935f-ae180a4e3d11-serving-cert\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514100 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-serving-cert\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514137 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514161 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjskd\" (UniqueName: \"kubernetes.io/projected/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-kube-api-access-xjskd\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514195 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-config\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514214 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2840702b-d22f-4184-bada-4cd337d79407-serving-cert\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514245 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.514266 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444e52ba-f376-40d9-b32f-aa5b523e4134-config\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.515970 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.516398 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 
14:08:34.516819 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517092 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517385 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517592 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-client-ca\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517659 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-serving-cert\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517696 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517709 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2wqh\" (UniqueName: \"kubernetes.io/projected/2840702b-d22f-4184-bada-4cd337d79407-kube-api-access-z2wqh\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 
14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517737 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/444e52ba-f376-40d9-b32f-aa5b523e4134-machine-approver-tls\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517822 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517819 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-audit-dir\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517903 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517972 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.517997 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-etcd-client\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518056 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6ml\" (UniqueName: \"kubernetes.io/projected/8c77bcf1-4025-4c35-9580-41e9a61195e8-kube-api-access-nj6ml\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518082 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-config\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518103 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-audit-policies\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518154 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad14aa6-962d-4f8f-babe-745f65d63560-serving-cert\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") 
" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518179 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518228 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518251 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-encryption-config\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518890 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.521509 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-6zspj"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.518972 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519002 4836 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519073 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519356 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519485 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519516 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519540 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519812 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519832 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519879 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.519940 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.520163 4836 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.520215 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.520323 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.520762 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.521386 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.516043 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.521479 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.521626 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.521707 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.524210 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7"] Feb 17 14:08:34 crc 
kubenswrapper[4836]: I0217 14:08:34.524770 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.524916 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.525427 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.525945 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-khbdr"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.532418 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.533189 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.526632 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.526748 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.526828 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.526512 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.532813 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.528634 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.529561 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.529561 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.529615 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.529740 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.529851 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.530000 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.530141 4836 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.538266 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.539089 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.556664 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.556986 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.557344 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.557679 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.560860 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.562066 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.562530 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.580440 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.586555 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.588247 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.591032 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.592321 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.594353 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.597343 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.597750 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.597944 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.598537 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ch9j6"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.599049 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.599107 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-72n7k"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.599157 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.599194 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.600168 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.600602 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.600741 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.602036 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-fqzrl"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.602582 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.603059 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.603409 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.604597 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.606545 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.606573 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.606711 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5vhz9"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.607230 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.607928 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.609215 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.609346 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jjmwc"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.610454 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.611010 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.611693 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-98frx"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.612671 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.613682 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.615878 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cnq25"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.616017 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-5cbbv"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.617352 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7klmp"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.618789 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8sd2q"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619068 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619108 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619132 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-encryption-config\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619159 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2840702b-d22f-4184-bada-4cd337d79407-config\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " 
pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619186 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619215 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619238 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7qrb\" (UniqueName: \"kubernetes.io/projected/66402e53-3287-45c4-bceb-78fc99836c5b-kube-api-access-q7qrb\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619261 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619312 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-metrics-tls\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619339 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619364 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2840702b-d22f-4184-bada-4cd337d79407-trusted-ca\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619386 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-policies\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619411 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-config\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619441 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ec8466-f311-4f81-ae38-48635b000ced-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619468 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-client\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619495 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619525 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-trusted-ca\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619550 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfgf7\" (UniqueName: \"kubernetes.io/projected/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-kube-api-access-rfgf7\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619571 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66402e53-3287-45c4-bceb-78fc99836c5b-node-pullsecrets\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619596 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619705 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb8wv\" (UniqueName: \"kubernetes.io/projected/c26f912f-f640-4b4c-ab61-dd2a163f12ab-kube-api-access-qb8wv\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619743 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619768 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkc7h\" (UniqueName: 
\"kubernetes.io/projected/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-kube-api-access-lkc7h\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619809 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66402e53-3287-45c4-bceb-78fc99836c5b-audit-dir\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619834 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhngg\" (UniqueName: \"kubernetes.io/projected/5ad14aa6-962d-4f8f-babe-745f65d63560-kube-api-access-fhngg\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619858 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3f2789-dd41-4e95-8174-db3a40098b0e-config\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619881 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-dir\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc 
kubenswrapper[4836]: I0217 14:08:34.619903 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619927 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.619958 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/921ecdc3-b5f3-44e4-9300-d25342d944d8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jf2hd\" (UID: \"921ecdc3-b5f3-44e4-9300-d25342d944d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620013 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/444e52ba-f376-40d9-b32f-aa5b523e4134-auth-proxy-config\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620039 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-ca\") pod \"etcd-operator-b45778765-7klmp\" (UID: 
\"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620079 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvr7\" (UniqueName: \"kubernetes.io/projected/f8080d32-cbe7-4b02-8791-d9f1f9aca269-kube-api-access-dcvr7\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620105 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-client-ca\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620129 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dptq7\" (UniqueName: \"kubernetes.io/projected/444e52ba-f376-40d9-b32f-aa5b523e4134-kube-api-access-dptq7\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620162 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-audit\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620187 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620232 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ec8466-f311-4f81-ae38-48635b000ced-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620269 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620313 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620342 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrncr\" (UniqueName: 
\"kubernetes.io/projected/95872171-94c1-4b8a-935f-ae180a4e3d11-kube-api-access-wrncr\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620544 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65684d1d-5242-464d-8caf-ad4866bf6a86-serving-cert\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620565 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620590 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-service-ca-bundle\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620611 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 
14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620634 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjcv7\" (UniqueName: \"kubernetes.io/projected/d9eb5c8b-f3c7-4068-82c7-28520f6905c6-kube-api-access-gjcv7\") pod \"downloads-7954f5f757-5cbbv\" (UID: \"d9eb5c8b-f3c7-4068-82c7-28520f6905c6\") " pod="openshift-console/downloads-7954f5f757-5cbbv" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620654 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171e2af0-2993-4cd3-942f-043bccca2813-metrics-tls\") pod \"dns-operator-744455d44c-8sd2q\" (UID: \"171e2af0-2993-4cd3-942f-043bccca2813\") " pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620688 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95872171-94c1-4b8a-935f-ae180a4e3d11-serving-cert\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620724 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-serving-cert\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620745 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a3f2789-dd41-4e95-8174-db3a40098b0e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: 
\"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620784 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c26f912f-f640-4b4c-ab61-dd2a163f12ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620807 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8080d32-cbe7-4b02-8791-d9f1f9aca269-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620838 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620873 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjskd\" (UniqueName: \"kubernetes.io/projected/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-kube-api-access-xjskd\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620902 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-client\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620926 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-service-ca\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620951 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620974 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htjx\" (UniqueName: \"kubernetes.io/projected/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-kube-api-access-6htjx\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.620999 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-config\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621023 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2840702b-d22f-4184-bada-4cd337d79407-serving-cert\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621045 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621080 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444e52ba-f376-40d9-b32f-aa5b523e4134-config\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621109 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621142 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z42f\" (UniqueName: 
\"kubernetes.io/projected/171e2af0-2993-4cd3-942f-043bccca2813-kube-api-access-5z42f\") pod \"dns-operator-744455d44c-8sd2q\" (UID: \"171e2af0-2993-4cd3-942f-043bccca2813\") " pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621169 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-client-ca\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621205 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8080d32-cbe7-4b02-8791-d9f1f9aca269-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621229 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-serving-cert\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621250 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2wqh\" (UniqueName: \"kubernetes.io/projected/2840702b-d22f-4184-bada-4cd337d79407-kube-api-access-z2wqh\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 
14:08:34.621263 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2840702b-d22f-4184-bada-4cd337d79407-config\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621273 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/444e52ba-f376-40d9-b32f-aa5b523e4134-machine-approver-tls\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621349 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66402e53-3287-45c4-bceb-78fc99836c5b-node-pullsecrets\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.621402 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5l6x4"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.622705 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-service-ca-bundle\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.622742 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.622950 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66402e53-3287-45c4-bceb-78fc99836c5b-audit-dir\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.623527 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9mxp\" (UniqueName: \"kubernetes.io/projected/921ecdc3-b5f3-44e4-9300-d25342d944d8-kube-api-access-n9mxp\") pod \"cluster-samples-operator-665b6dd947-jf2hd\" (UID: \"921ecdc3-b5f3-44e4-9300-d25342d944d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.623565 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.623614 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.623670 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.623702 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.623765 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-audit-dir\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648008 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/444e52ba-f376-40d9-b32f-aa5b523e4134-config\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.646361 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.623820 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/444e52ba-f376-40d9-b32f-aa5b523e4134-auth-proxy-config\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.647453 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648117 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.647896 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-audit-dir\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648253 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a3f2789-dd41-4e95-8174-db3a40098b0e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 
14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648334 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95872171-94c1-4b8a-935f-ae180a4e3d11-config\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648382 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-client-ca\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648413 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648509 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648669 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2840702b-d22f-4184-bada-4cd337d79407-trusted-ca\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 
14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.650888 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95872171-94c1-4b8a-935f-ae180a4e3d11-serving-cert\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.652180 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2840702b-d22f-4184-bada-4cd337d79407-serving-cert\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.652478 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-client-ca\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.652565 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.652817 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/444e52ba-f376-40d9-b32f-aa5b523e4134-machine-approver-tls\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.653666 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.660748 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-encryption-config\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.648596 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6jrc\" (UniqueName: \"kubernetes.io/projected/18ec8466-f311-4f81-ae38-48635b000ced-kube-api-access-g6jrc\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.664229 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-etcd-client\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.664274 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c26f912f-f640-4b4c-ab61-dd2a163f12ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.664326 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6ml\" 
(UniqueName: \"kubernetes.io/projected/8c77bcf1-4025-4c35-9580-41e9a61195e8-kube-api-access-nj6ml\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.664353 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-config\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.664379 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkl9p\" (UniqueName: \"kubernetes.io/projected/65684d1d-5242-464d-8caf-ad4866bf6a86-kube-api-access-jkl9p\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.664407 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-audit-policies\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.664429 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad14aa6-962d-4f8f-babe-745f65d63560-serving-cert\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc 
kubenswrapper[4836]: I0217 14:08:34.665523 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-audit-policies\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.666464 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-config\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.667493 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.667917 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-etcd-client\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.671200 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.672556 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad14aa6-962d-4f8f-babe-745f65d63560-serving-cert\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:34 crc 
kubenswrapper[4836]: I0217 14:08:34.675334 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.676851 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-284hg"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.677596 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-284hg" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.679218 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6zspj"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.681169 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.682218 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.683308 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.684342 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-72n7k"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.685286 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.686915 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6scjm"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.687623 4836 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.688076 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.688397 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kcm8s"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.690377 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6rsds"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.691485 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.692832 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.694255 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.695592 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.696763 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-khbdr"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.698671 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.699666 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.701335 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.702149 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-284hg"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.703386 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6scjm"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.704561 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.705926 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.706876 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.707580 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ch9j6"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.708697 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.709767 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.710953 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 
14:08:34.712082 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5vhz9"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.716160 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9kmt4"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.716845 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.717511 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8ngwr"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.718704 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9kmt4"] Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.718769 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.727255 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.746597 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765054 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-metrics-tls\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765087 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-policies\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765105 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765146 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-config\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765161 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-trusted-ca\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765176 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ec8466-f311-4f81-ae38-48635b000ced-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765822 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-config\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.766377 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-trusted-ca\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.765934 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d30a99f-a727-4eb4-9a32-0508707384bf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.766490 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkc7h\" (UniqueName: \"kubernetes.io/projected/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-kube-api-access-lkc7h\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.766510 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb8wv\" (UniqueName: \"kubernetes.io/projected/c26f912f-f640-4b4c-ab61-dd2a163f12ab-kube-api-access-qb8wv\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.766552 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3f2789-dd41-4e95-8174-db3a40098b0e-config\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.766613 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-policies\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767078 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a3f2789-dd41-4e95-8174-db3a40098b0e-config\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767220 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767492 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-dir\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767536 
4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/921ecdc3-b5f3-44e4-9300-d25342d944d8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jf2hd\" (UID: \"921ecdc3-b5f3-44e4-9300-d25342d944d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767563 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-ca\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767960 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvr7\" (UniqueName: \"kubernetes.io/projected/f8080d32-cbe7-4b02-8791-d9f1f9aca269-kube-api-access-dcvr7\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767982 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/331c189b-0cb5-4733-a233-894429c709a9-srv-cert\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768002 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-router-certs\") pod 
\"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768051 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65684d1d-5242-464d-8caf-ad4866bf6a86-serving-cert\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768068 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ec8466-f311-4f81-ae38-48635b000ced-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768086 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768102 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768126 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/331c189b-0cb5-4733-a233-894429c709a9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768147 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768171 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171e2af0-2993-4cd3-942f-043bccca2813-metrics-tls\") pod \"dns-operator-744455d44c-8sd2q\" (UID: \"171e2af0-2993-4cd3-942f-043bccca2813\") " pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768187 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768203 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xj5j\" (UniqueName: \"kubernetes.io/projected/331c189b-0cb5-4733-a233-894429c709a9-kube-api-access-6xj5j\") pod \"olm-operator-6b444d44fb-sknds\" (UID: 
\"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768225 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a3f2789-dd41-4e95-8174-db3a40098b0e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768248 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-client\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768265 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c26f912f-f640-4b4c-ab61-dd2a163f12ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768281 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8080d32-cbe7-4b02-8791-d9f1f9aca269-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768359 4836 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-service-ca\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768388 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768410 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6htjx\" (UniqueName: \"kubernetes.io/projected/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-kube-api-access-6htjx\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768434 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d30a99f-a727-4eb4-9a32-0508707384bf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768456 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z42f\" (UniqueName: \"kubernetes.io/projected/171e2af0-2993-4cd3-942f-043bccca2813-kube-api-access-5z42f\") pod \"dns-operator-744455d44c-8sd2q\" (UID: \"171e2af0-2993-4cd3-942f-043bccca2813\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768477 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768505 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768631 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-ca\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768522 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8080d32-cbe7-4b02-8791-d9f1f9aca269-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.767592 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-dir\") 
pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768930 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9mxp\" (UniqueName: \"kubernetes.io/projected/921ecdc3-b5f3-44e4-9300-d25342d944d8-kube-api-access-n9mxp\") pod \"cluster-samples-operator-665b6dd947-jf2hd\" (UID: \"921ecdc3-b5f3-44e4-9300-d25342d944d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768946 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a3f2789-dd41-4e95-8174-db3a40098b0e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768962 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768979 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" 
Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.768994 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769012 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6jrc\" (UniqueName: \"kubernetes.io/projected/18ec8466-f311-4f81-ae38-48635b000ced-kube-api-access-g6jrc\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769050 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqh4c\" (UniqueName: \"kubernetes.io/projected/19216a1e-34af-4764-a621-e5097db4751b-kube-api-access-bqh4c\") pod \"multus-admission-controller-857f4d67dd-ch9j6\" (UID: \"19216a1e-34af-4764-a621-e5097db4751b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769088 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crtnq\" (UniqueName: \"kubernetes.io/projected/1d30a99f-a727-4eb4-9a32-0508707384bf-kube-api-access-crtnq\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769105 4836 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c26f912f-f640-4b4c-ab61-dd2a163f12ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769150 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769168 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfrbt\" (UniqueName: \"kubernetes.io/projected/985bc83c-52fa-45dc-ab4f-6e47ee47683e-kube-api-access-sfrbt\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769184 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkl9p\" (UniqueName: \"kubernetes.io/projected/65684d1d-5242-464d-8caf-ad4866bf6a86-kube-api-access-jkl9p\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769200 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19216a1e-34af-4764-a621-e5097db4751b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ch9j6\" (UID: 
\"19216a1e-34af-4764-a621-e5097db4751b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.769243 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.770322 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.770768 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.771355 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.771556 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/18ec8466-f311-4f81-ae38-48635b000ced-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.771864 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-service-ca\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.772312 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-metrics-tls\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.772609 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c26f912f-f640-4b4c-ab61-dd2a163f12ab-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.773332 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ec8466-f311-4f81-ae38-48635b000ced-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 
14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.773409 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.775284 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.775713 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.776220 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/921ecdc3-b5f3-44e4-9300-d25342d944d8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jf2hd\" (UID: \"921ecdc3-b5f3-44e4-9300-d25342d944d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.778721 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.779631 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65684d1d-5242-464d-8caf-ad4866bf6a86-serving-cert\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.780971 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8080d32-cbe7-4b02-8791-d9f1f9aca269-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.781998 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/171e2af0-2993-4cd3-942f-043bccca2813-metrics-tls\") pod \"dns-operator-744455d44c-8sd2q\" (UID: \"171e2af0-2993-4cd3-942f-043bccca2813\") " pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.782129 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.782281 
4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a3f2789-dd41-4e95-8174-db3a40098b0e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.782410 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.782412 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.784832 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c26f912f-f640-4b4c-ab61-dd2a163f12ab-serving-cert\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.786829 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.786920 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/65684d1d-5242-464d-8caf-ad4866bf6a86-etcd-client\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.795365 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.806620 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.826862 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.854176 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.866746 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.869887 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqh4c\" (UniqueName: \"kubernetes.io/projected/19216a1e-34af-4764-a621-e5097db4751b-kube-api-access-bqh4c\") pod \"multus-admission-controller-857f4d67dd-ch9j6\" (UID: \"19216a1e-34af-4764-a621-e5097db4751b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.869943 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfrbt\" (UniqueName: 
\"kubernetes.io/projected/985bc83c-52fa-45dc-ab4f-6e47ee47683e-kube-api-access-sfrbt\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.869969 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crtnq\" (UniqueName: \"kubernetes.io/projected/1d30a99f-a727-4eb4-9a32-0508707384bf-kube-api-access-crtnq\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870014 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870052 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19216a1e-34af-4764-a621-e5097db4751b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ch9j6\" (UID: \"19216a1e-34af-4764-a621-e5097db4751b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870148 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d30a99f-a727-4eb4-9a32-0508707384bf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 
14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870235 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/331c189b-0cb5-4733-a233-894429c709a9-srv-cert\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870280 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/331c189b-0cb5-4733-a233-894429c709a9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870336 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870358 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xj5j\" (UniqueName: \"kubernetes.io/projected/331c189b-0cb5-4733-a233-894429c709a9-kube-api-access-6xj5j\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.870421 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d30a99f-a727-4eb4-9a32-0508707384bf-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.871592 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8080d32-cbe7-4b02-8791-d9f1f9aca269-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.886650 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.906665 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.927446 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.946698 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.967956 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.973809 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/331c189b-0cb5-4733-a233-894429c709a9-srv-cert\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.987065 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 14:08:34 crc kubenswrapper[4836]: I0217 14:08:34.993420 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/331c189b-0cb5-4733-a233-894429c709a9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.007347 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.027385 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.047396 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.066720 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.087635 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.107605 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.127427 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.134429 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.147928 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.173849 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.183009 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.187142 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.207879 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.227049 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.246871 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.266737 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.286640 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.327145 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.347259 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.366743 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.387388 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.406886 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.427260 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.433175 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/19216a1e-34af-4764-a621-e5097db4751b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ch9j6\" 
(UID: \"19216a1e-34af-4764-a621-e5097db4751b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.446760 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.467184 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.488272 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.508430 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.527600 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.548516 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.568350 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.587766 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.605589 4836 request.go:700] Waited for 1.004541941s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/secrets?fieldSelector=metadata.name%3Dsigning-key&limit=500&resourceVersion=0 Feb 17 14:08:35 
crc kubenswrapper[4836]: I0217 14:08:35.607356 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.620179 4836 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.620376 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images podName:1ecc7c98-e9a3-4850-a741-7e0bcf670e27 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.120349395 +0000 UTC m=+142.463277664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images") pod "machine-api-operator-5694c8668f-jjmwc" (UID: "1ecc7c98-e9a3-4850-a741-7e0bcf670e27") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622446 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622473 4836 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622475 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622536 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622507 4836 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.122492581 +0000 UTC m=+142.465420850 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622573 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-trusted-ca-bundle podName:e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.122566813 +0000 UTC m=+142.465495082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-trusted-ca-bundle") pod "apiserver-7bbb656c7d-z6h7n" (UID: "e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622584 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-audit podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.122578454 +0000 UTC m=+142.465506723 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-audit") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622597 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.122591314 +0000 UTC m=+142.465519573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622599 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.622703 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.122670236 +0000 UTC m=+142.465598555 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624010 4836 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624044 4836 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624115 4836 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624134 4836 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624140 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls podName:1ecc7c98-e9a3-4850-a741-7e0bcf670e27 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.124106454 +0000 UTC m=+142.467034763 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-jjmwc" (UID: "1ecc7c98-e9a3-4850-a741-7e0bcf670e27") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624196 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.124180956 +0000 UTC m=+142.467109215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624211 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config podName:8c77bcf1-4025-4c35-9580-41e9a61195e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.124202906 +0000 UTC m=+142.467131165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config") pod "controller-manager-879f6c89f-5l6x4" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.624243 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-serving-cert podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:36.124221097 +0000 UTC m=+142.467149436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-serving-cert") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.626503 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.646391 4836 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.646536 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles podName:8c77bcf1-4025-4c35-9580-41e9a61195e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.146501549 +0000 UTC m=+142.489429858 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles") pod "controller-manager-879f6c89f-5l6x4" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.646756 4836 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.646864 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-client podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:36.146835857 +0000 UTC m=+142.489764166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-client") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.647283 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.647453 4836 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.647514 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-serving-cert podName:e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.147500775 +0000 UTC m=+142.490429064 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-serving-cert") pod "apiserver-7bbb656c7d-z6h7n" (UID: "e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.648760 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.648804 4836 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.648821 4836 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.648874 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.148845621 +0000 UTC m=+142.491773930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.648917 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert podName:8c77bcf1-4025-4c35-9580-41e9a61195e8 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.148905133 +0000 UTC m=+142.491833442 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert") pod "controller-manager-879f6c89f-5l6x4" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.648938 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config podName:1ecc7c98-e9a3-4850-a741-7e0bcf670e27 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.148928573 +0000 UTC m=+142.491856882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config") pod "machine-api-operator-5694c8668f-jjmwc" (UID: "1ecc7c98-e9a3-4850-a741-7e0bcf670e27") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.667008 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.693060 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.707564 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.727221 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.747922 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.767167 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.788778 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.807051 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.827551 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.847749 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.869125 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.870783 4836 secret.go:188] Couldn't get secret openshift-apiserver-operator/openshift-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.870849 4836 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.870912 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d30a99f-a727-4eb4-9a32-0508707384bf-serving-cert podName:1d30a99f-a727-4eb4-9a32-0508707384bf nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.370876326 +0000 UTC m=+142.713804635 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1d30a99f-a727-4eb4-9a32-0508707384bf-serving-cert") pod "openshift-apiserver-operator-796bbdcf4f-dsrq8" (UID: "1d30a99f-a727-4eb4-9a32-0508707384bf") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: E0217 14:08:35.870952 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1d30a99f-a727-4eb4-9a32-0508707384bf-config podName:1d30a99f-a727-4eb4-9a32-0508707384bf nodeName:}" failed. No retries permitted until 2026-02-17 14:08:36.370935778 +0000 UTC m=+142.713864087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1d30a99f-a727-4eb4-9a32-0508707384bf-config") pod "openshift-apiserver-operator-796bbdcf4f-dsrq8" (UID: "1d30a99f-a727-4eb4-9a32-0508707384bf") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.887732 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.907138 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.927607 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.947529 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 14:08:35 crc kubenswrapper[4836]: I0217 14:08:35.967136 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 14:08:35 crc 
kubenswrapper[4836]: I0217 14:08:35.987915 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.007138 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.027516 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.047584 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.066866 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.087138 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.107436 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.127480 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.146828 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187068 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: 
\"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187111 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187150 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-audit\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187181 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187215 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-serving-cert\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187245 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187312 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187350 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-serving-cert\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187385 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187448 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187841 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: 
\"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187881 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187910 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187946 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.187968 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-client\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.188009 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls\") pod 
\"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.202465 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dptq7\" (UniqueName: \"kubernetes.io/projected/444e52ba-f376-40d9-b32f-aa5b523e4134-kube-api-access-dptq7\") pod \"machine-approver-56656f9798-cwlxz\" (UID: \"444e52ba-f376-40d9-b32f-aa5b523e4134\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.224253 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrncr\" (UniqueName: \"kubernetes.io/projected/95872171-94c1-4b8a-935f-ae180a4e3d11-kube-api-access-wrncr\") pod \"authentication-operator-69f744f599-98frx\" (UID: \"95872171-94c1-4b8a-935f-ae180a4e3d11\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.241062 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2wqh\" (UniqueName: \"kubernetes.io/projected/2840702b-d22f-4184-bada-4cd337d79407-kube-api-access-z2wqh\") pod \"console-operator-58897d9998-kcm8s\" (UID: \"2840702b-d22f-4184-bada-4cd337d79407\") " pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.261769 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjcv7\" (UniqueName: \"kubernetes.io/projected/d9eb5c8b-f3c7-4068-82c7-28520f6905c6-kube-api-access-gjcv7\") pod \"downloads-7954f5f757-5cbbv\" (UID: \"d9eb5c8b-f3c7-4068-82c7-28520f6905c6\") " pod="openshift-console/downloads-7954f5f757-5cbbv" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.289963 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fhngg\" (UniqueName: \"kubernetes.io/projected/5ad14aa6-962d-4f8f-babe-745f65d63560-kube-api-access-fhngg\") pod \"route-controller-manager-6576b87f9c-2mmw4\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.361902 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.371377 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 14:08:36 crc kubenswrapper[4836]: W0217 14:08:36.381234 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod444e52ba_f376_40d9_b32f_aa5b523e4134.slice/crio-5870e23e3158b354b5696c829a5d80e0b5b48d934f8bae509eca81077c2a9882 WatchSource:0}: Error finding container 5870e23e3158b354b5696c829a5d80e0b5b48d934f8bae509eca81077c2a9882: Status 404 returned error can't find the container with id 5870e23e3158b354b5696c829a5d80e0b5b48d934f8bae509eca81077c2a9882 Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.387871 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.391777 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d30a99f-a727-4eb4-9a32-0508707384bf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.392922 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1d30a99f-a727-4eb4-9a32-0508707384bf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.393484 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d30a99f-a727-4eb4-9a32-0508707384bf-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.394899 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d30a99f-a727-4eb4-9a32-0508707384bf-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.396214 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.405537 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.407869 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.414073 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-5cbbv" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.427469 4836 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.451600 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.468443 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.471539 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.487320 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.513066 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.528622 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.549577 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.567230 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.595126 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 14:08:36 crc 
kubenswrapper[4836]: I0217 14:08:36.623094 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.625404 4836 request.go:700] Waited for 1.858710819s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.647750 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb8wv\" (UniqueName: \"kubernetes.io/projected/c26f912f-f640-4b4c-ab61-dd2a163f12ab-kube-api-access-qb8wv\") pod \"openshift-config-operator-7777fb866f-2rnsr\" (UID: \"c26f912f-f640-4b4c-ab61-dd2a163f12ab\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.669046 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkc7h\" (UniqueName: \"kubernetes.io/projected/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-kube-api-access-lkc7h\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.700872 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9a3f2789-dd41-4e95-8174-db3a40098b0e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wd65t\" (UID: \"9a3f2789-dd41-4e95-8174-db3a40098b0e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.707370 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvr7\" (UniqueName: 
\"kubernetes.io/projected/f8080d32-cbe7-4b02-8791-d9f1f9aca269-kube-api-access-dcvr7\") pod \"kube-storage-version-migrator-operator-b67b599dd-rq6q9\" (UID: \"f8080d32-cbe7-4b02-8791-d9f1f9aca269\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.730417 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z42f\" (UniqueName: \"kubernetes.io/projected/171e2af0-2993-4cd3-942f-043bccca2813-kube-api-access-5z42f\") pod \"dns-operator-744455d44c-8sd2q\" (UID: \"171e2af0-2993-4cd3-942f-043bccca2813\") " pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.748536 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9mxp\" (UniqueName: \"kubernetes.io/projected/921ecdc3-b5f3-44e4-9300-d25342d944d8-kube-api-access-n9mxp\") pod \"cluster-samples-operator-665b6dd947-jf2hd\" (UID: \"921ecdc3-b5f3-44e4-9300-d25342d944d8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.783059 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3bb5e0b8-9179-4570-a3d8-acaa80b2c884-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7htl2\" (UID: \"3bb5e0b8-9179-4570-a3d8-acaa80b2c884\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.786342 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htjx\" (UniqueName: \"kubernetes.io/projected/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-kube-api-access-6htjx\") pod \"oauth-openshift-558db77b4-6rsds\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 
14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.791795 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.802164 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6jrc\" (UniqueName: \"kubernetes.io/projected/18ec8466-f311-4f81-ae38-48635b000ced-kube-api-access-g6jrc\") pod \"openshift-controller-manager-operator-756b6f6bc6-nncbw\" (UID: \"18ec8466-f311-4f81-ae38-48635b000ced\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.816624 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.822349 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.827713 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkl9p\" (UniqueName: \"kubernetes.io/projected/65684d1d-5242-464d-8caf-ad4866bf6a86-kube-api-access-jkl9p\") pod \"etcd-operator-b45778765-7klmp\" (UID: \"65684d1d-5242-464d-8caf-ad4866bf6a86\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.827908 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.836038 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.842645 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.850728 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.855829 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqh4c\" (UniqueName: \"kubernetes.io/projected/19216a1e-34af-4764-a621-e5097db4751b-kube-api-access-bqh4c\") pod \"multus-admission-controller-857f4d67dd-ch9j6\" (UID: \"19216a1e-34af-4764-a621-e5097db4751b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.857391 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5cbbv"] Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.863424 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crtnq\" (UniqueName: \"kubernetes.io/projected/1d30a99f-a727-4eb4-9a32-0508707384bf-kube-api-access-crtnq\") pod \"openshift-apiserver-operator-796bbdcf4f-dsrq8\" (UID: \"1d30a99f-a727-4eb4-9a32-0508707384bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.863486 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.889340 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfrbt\" (UniqueName: \"kubernetes.io/projected/985bc83c-52fa-45dc-ab4f-6e47ee47683e-kube-api-access-sfrbt\") pod \"marketplace-operator-79b997595-khbdr\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") " pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:36 crc kubenswrapper[4836]: W0217 14:08:36.897510 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9eb5c8b_f3c7_4068_82c7_28520f6905c6.slice/crio-a85018b0fd8f1071325c064f8c51af4a4dc2d8ec54e655fce3e24d025f5a1f07 WatchSource:0}: Error finding container a85018b0fd8f1071325c064f8c51af4a4dc2d8ec54e655fce3e24d025f5a1f07: Status 404 returned error can't find the container with id a85018b0fd8f1071325c064f8c51af4a4dc2d8ec54e655fce3e24d025f5a1f07 Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.904536 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xj5j\" (UniqueName: \"kubernetes.io/projected/331c189b-0cb5-4733-a233-894429c709a9-kube-api-access-6xj5j\") pod \"olm-operator-6b444d44fb-sknds\" (UID: \"331c189b-0cb5-4733-a233-894429c709a9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.907020 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.910607 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.926778 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.942031 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.957087 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.968937 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.984778 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.986950 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.988032 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-serving-cert\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:36 crc kubenswrapper[4836]: I0217 14:08:36.995252 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6ml\" (UniqueName: \"kubernetes.io/projected/8c77bcf1-4025-4c35-9580-41e9a61195e8-kube-api-access-nj6ml\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.014551 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.027574 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.031957 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.039664 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.043838 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-serving-cert\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.049625 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6rsds"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.066864 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.088257 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:08:37 crc 
kubenswrapper[4836]: I0217 14:08:37.091647 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.102585 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.107051 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.107956 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-audit\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.109001 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"] Feb 17 14:08:37 crc kubenswrapper[4836]: W0217 14:08:37.113128 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d8fb42_9c68_4eb3_a8c9_4e4a98772ae7.slice/crio-f9b98c0ff2091be32d114061b6cc2daa5338c6afb2d30dae1e18fe2afc9b3ea3 WatchSource:0}: Error finding container f9b98c0ff2091be32d114061b6cc2daa5338c6afb2d30dae1e18fe2afc9b3ea3: Status 404 returned error can't find the container with id f9b98c0ff2091be32d114061b6cc2daa5338c6afb2d30dae1e18fe2afc9b3ea3 Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.116696 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-kcm8s"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.129162 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.136089 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-98frx"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.137682 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.144585 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert\") pod \"controller-manager-879f6c89f-5l6x4\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.147125 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.154173 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:37 crc kubenswrapper[4836]: W0217 14:08:37.162812 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad14aa6_962d_4f8f_babe_745f65d63560.slice/crio-5a18cb47469c9084e91c362d0628474be4ea76582e846e1e93705e36c466141f WatchSource:0}: Error finding container 
5a18cb47469c9084e91c362d0628474be4ea76582e846e1e93705e36c466141f: Status 404 returned error can't find the container with id 5a18cb47469c9084e91c362d0628474be4ea76582e846e1e93705e36c466141f Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.167507 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.169646 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9"] Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.187147 4836 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188002 4836 projected.go:194] Error preparing data for projected volume kube-api-access-rfgf7 for pod openshift-machine-api/machine-api-operator-5694c8668f-jjmwc: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188086 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-kube-api-access-rfgf7 podName:1ecc7c98-e9a3-4850-a741-7e0bcf670e27 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:37.688062438 +0000 UTC m=+144.030990707 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rfgf7" (UniqueName: "kubernetes.io/projected/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-kube-api-access-rfgf7") pod "machine-api-operator-5694c8668f-jjmwc" (UID: "1ecc7c98-e9a3-4850-a741-7e0bcf670e27") : failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.187215 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188613 4836 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188694 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls podName:1ecc7c98-e9a3-4850-a741-7e0bcf670e27 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.188674135 +0000 UTC m=+144.531602464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-jjmwc" (UID: "1ecc7c98-e9a3-4850-a741-7e0bcf670e27") : failed to sync secret cache: timed out waiting for the condition Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188757 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188782 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:38.188775947 +0000 UTC m=+144.531704216 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188781 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188804 4836 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188824 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config podName:1ecc7c98-e9a3-4850-a741-7e0bcf670e27 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.188818958 +0000 UTC m=+144.531747227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config") pod "machine-api-operator-5694c8668f-jjmwc" (UID: "1ecc7c98-e9a3-4850-a741-7e0bcf670e27") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188848 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188900 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.188835909 +0000 UTC m=+144.531764218 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188925 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.188914741 +0000 UTC m=+144.531843110 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.188968 4836 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.189006 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.188995293 +0000 UTC m=+144.531923622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync secret cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.189038 4836 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.189059 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images podName:1ecc7c98-e9a3-4850-a741-7e0bcf670e27 nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.189053515 +0000 UTC m=+144.531981784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images") pod "machine-api-operator-5694c8668f-jjmwc" (UID: "1ecc7c98-e9a3-4850-a741-7e0bcf670e27") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.189075 4836 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.189106 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config podName:66402e53-3287-45c4-bceb-78fc99836c5b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.189099486 +0000 UTC m=+144.532027855 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config") pod "apiserver-76f77b778f-cnq25" (UID: "66402e53-3287-45c4-bceb-78fc99836c5b") : failed to sync configmap cache: timed out waiting for the condition
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.192214 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjskd\" (UniqueName: \"kubernetes.io/projected/e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40-kube-api-access-xjskd\") pod \"apiserver-7bbb656c7d-z6h7n\" (UID: \"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.201074 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-client\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.210310 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.222203 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.230213 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.234330 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.254983 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.269815 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.295061 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.306950 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.309603 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" event={"ID":"9a3f2789-dd41-4e95-8174-db3a40098b0e","Type":"ContainerStarted","Data":"d799c4391a6169e5f0f62b3e1b0b8dfd89a83d47ce55959b8154cdef4b995635"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.314902 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" event={"ID":"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7","Type":"ContainerStarted","Data":"f9b98c0ff2091be32d114061b6cc2daa5338c6afb2d30dae1e18fe2afc9b3ea3"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.319734 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" event={"ID":"f8080d32-cbe7-4b02-8791-d9f1f9aca269","Type":"ContainerStarted","Data":"1705268b5549ed86694d48833fc8eafd8038dbc2777f480409afc935da199ae6"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.322180 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kcm8s" event={"ID":"2840702b-d22f-4184-bada-4cd337d79407","Type":"ContainerStarted","Data":"134a9b18572422f88105c67f86dd3ba6359bba7958ffe80d7adf51cd875bec9c"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.324887 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" event={"ID":"444e52ba-f376-40d9-b32f-aa5b523e4134","Type":"ContainerStarted","Data":"ff1ce2a2ed81c25aa98d7d7035011726b226ae8015dadecc845c0d72af7c204d"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.324926 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" event={"ID":"444e52ba-f376-40d9-b32f-aa5b523e4134","Type":"ContainerStarted","Data":"a9f603d37f5106008f4b5191c2012e8d9b2c86e1c1a683e4a6edfbeff60eac0d"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.324939 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" event={"ID":"444e52ba-f376-40d9-b32f-aa5b523e4134","Type":"ContainerStarted","Data":"5870e23e3158b354b5696c829a5d80e0b5b48d934f8bae509eca81077c2a9882"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.327766 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" event={"ID":"95872171-94c1-4b8a-935f-ae180a4e3d11","Type":"ContainerStarted","Data":"18b274722755de1985dbce64b08d691afd2c8f098180eef0d46ca7be76106cef"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.328326 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.331843 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5cbbv" event={"ID":"d9eb5c8b-f3c7-4068-82c7-28520f6905c6","Type":"ContainerStarted","Data":"92b59bab9fd909d359405ecf217a49ab1de8122281a49768577c5a706060d118"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.331894 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5cbbv" event={"ID":"d9eb5c8b-f3c7-4068-82c7-28520f6905c6","Type":"ContainerStarted","Data":"a85018b0fd8f1071325c064f8c51af4a4dc2d8ec54e655fce3e24d025f5a1f07"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.332422 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5cbbv"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.333737 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.333787 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.334263 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7qrb\" (UniqueName: \"kubernetes.io/projected/66402e53-3287-45c4-bceb-78fc99836c5b-kube-api-access-q7qrb\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.337099 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" event={"ID":"5ad14aa6-962d-4f8f-babe-745f65d63560","Type":"ContainerStarted","Data":"5a18cb47469c9084e91c362d0628474be4ea76582e846e1e93705e36c466141f"}
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.346668 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.370808 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.393845 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.410170 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.430947 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512107 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-trusted-ca\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512155 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/296ae94a-36e6-480b-9395-8f6a96621fdf-service-ca-bundle\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512186 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhp9d\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-kube-api-access-vhp9d\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512223 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-serving-cert\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512256 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-config\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512280 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1679c4a6-a707-4150-825b-5cb8b90cb27c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512321 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njj8f\" (UniqueName: \"kubernetes.io/projected/628fd7f0-d4b6-4866-b7d4-6966ed698611-kube-api-access-njj8f\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512374 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvpk\" (UniqueName: \"kubernetes.io/projected/6d52104b-91e7-4a3a-9138-163eb850485d-kube-api-access-grvpk\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512408 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-serving-cert\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512459 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a506e2e-c940-4f10-b89c-948d10ba8902-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512513 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf64r\" (UniqueName: \"kubernetes.io/projected/83427963-071f-40a0-8988-b39a3d41e59f-kube-api-access-jf64r\") pod \"migrator-59844c95c7-gtjx7\" (UID: \"83427963-071f-40a0-8988-b39a3d41e59f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512549 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad67d365-7ef5-406c-9ffe-6f66253704c9-profile-collector-cert\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512576 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cea58b47-da5e-4dc7-be23-19d8408318d7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jhzxl\" (UID: \"cea58b47-da5e-4dc7-be23-19d8408318d7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512608 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512672 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-bound-sa-token\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512712 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9lz\" (UniqueName: \"kubernetes.io/projected/1ce6cce5-c0bb-4d10-8458-bb9e15832a9c-kube-api-access-vc9lz\") pod \"package-server-manager-789f6589d5-blr59\" (UID: \"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512747 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/628fd7f0-d4b6-4866-b7d4-6966ed698611-proxy-tls\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512799 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxf8g\" (UniqueName: \"kubernetes.io/projected/1a506e2e-c940-4f10-b89c-948d10ba8902-kube-api-access-qxf8g\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512824 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-tls\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512850 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cd3f585-c95f-43ee-962c-ea33aff90415-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512874 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1679c4a6-a707-4150-825b-5cb8b90cb27c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.512989 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rsrv\" (UniqueName: \"kubernetes.io/projected/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-kube-api-access-7rsrv\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513015 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-config\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513038 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad67d365-7ef5-406c-9ffe-6f66253704c9-srv-cert\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513060 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c758606a-b3e4-494e-a2a6-7a7320277b37-tmpfs\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513082 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb437c-beea-4f2d-b3f7-505b87fe6dee-config-volume\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513103 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-oauth-serving-cert\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513129 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjkns\" (UniqueName: \"kubernetes.io/projected/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-kube-api-access-fjkns\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513152 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/628fd7f0-d4b6-4866-b7d4-6966ed698611-images\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513174 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c758606a-b3e4-494e-a2a6-7a7320277b37-apiservice-cert\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513200 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cd3f585-c95f-43ee-962c-ea33aff90415-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513254 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1679c4a6-a707-4150-825b-5cb8b90cb27c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513277 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a506e2e-c940-4f10-b89c-948d10ba8902-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513376 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r66m8\" (UniqueName: \"kubernetes.io/projected/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-kube-api-access-r66m8\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513448 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-signing-key\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513486 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-oauth-config\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513535 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-default-certificate\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513578 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6grq\" (UniqueName: \"kubernetes.io/projected/cea58b47-da5e-4dc7-be23-19d8408318d7-kube-api-access-x6grq\") pod \"control-plane-machine-set-operator-78cbb6b69f-jhzxl\" (UID: \"cea58b47-da5e-4dc7-be23-19d8408318d7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513643 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/628fd7f0-d4b6-4866-b7d4-6966ed698611-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513665 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84mc8\" (UniqueName: \"kubernetes.io/projected/c758606a-b3e4-494e-a2a6-7a7320277b37-kube-api-access-84mc8\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513714 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-certificates\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513780 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-proxy-tls\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513805 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdnd\" (UniqueName: \"kubernetes.io/projected/91eb437c-beea-4f2d-b3f7-505b87fe6dee-kube-api-access-5kdnd\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513826 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513904 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd2r2\" (UniqueName: \"kubernetes.io/projected/ad67d365-7ef5-406c-9ffe-6f66253704c9-kube-api-access-gd2r2\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.513964 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514084 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-console-config\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514191 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-trusted-ca-bundle\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514216 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c758606a-b3e4-494e-a2a6-7a7320277b37-webhook-cert\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514323 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-signing-cabundle\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514347 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-stats-auth\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514425 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9t82\" (UniqueName: \"kubernetes.io/projected/296ae94a-36e6-480b-9395-8f6a96621fdf-kube-api-access-p9t82\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514484 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-service-ca\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514537 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514645 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91eb437c-beea-4f2d-b3f7-505b87fe6dee-secret-volume\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514672 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a506e2e-c940-4f10-b89c-948d10ba8902-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514755 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-metrics-certs\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl"
Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.514813 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce6cce5-c0bb-4d10-8458-bb9e15832a9c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-blr59\" (UID: \"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59"
Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.515354 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.015342148 +0000 UTC m=+144.358270417 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.591129 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.592076 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.615894 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.616425 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kdnd\" (UniqueName: \"kubernetes.io/projected/91eb437c-beea-4f2d-b3f7-505b87fe6dee-kube-api-access-5kdnd\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.617448 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:38.117425048 +0000 UTC m=+144.460353317 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618277 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618349 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd2r2\" (UniqueName: \"kubernetes.io/projected/ad67d365-7ef5-406c-9ffe-6f66253704c9-kube-api-access-gd2r2\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618379 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618426 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-plugins-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618479 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-console-config\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618543 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8efc7eee-3b20-4cdf-9062-d64472b2c888-certs\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618577 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-trusted-ca-bundle\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618601 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c758606a-b3e4-494e-a2a6-7a7320277b37-webhook-cert\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618625 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f70daa4b-d685-406a-ba3a-7fa6d672acdd-cert\") pod \"ingress-canary-9kmt4\" (UID: \"f70daa4b-d685-406a-ba3a-7fa6d672acdd\") " pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618757 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-stats-auth\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618799 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-signing-cabundle\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618850 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9t82\" (UniqueName: \"kubernetes.io/projected/296ae94a-36e6-480b-9395-8f6a96621fdf-kube-api-access-p9t82\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.618888 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-service-ca\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.619088 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.619175 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91eb437c-beea-4f2d-b3f7-505b87fe6dee-secret-volume\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.619228 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a506e2e-c940-4f10-b89c-948d10ba8902-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.619267 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-metrics-certs\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.619308 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-socket-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 
17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.619339 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce6cce5-c0bb-4d10-8458-bb9e15832a9c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-blr59\" (UID: \"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620630 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-trusted-ca\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620665 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/296ae94a-36e6-480b-9395-8f6a96621fdf-service-ca-bundle\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620687 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-csi-data-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620715 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhp9d\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-kube-api-access-vhp9d\") pod 
\"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620783 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-serving-cert\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620817 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-config\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620947 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1679c4a6-a707-4150-825b-5cb8b90cb27c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.620975 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx2zw\" (UniqueName: \"kubernetes.io/projected/f70daa4b-d685-406a-ba3a-7fa6d672acdd-kube-api-access-lx2zw\") pod \"ingress-canary-9kmt4\" (UID: \"f70daa4b-d685-406a-ba3a-7fa6d672acdd\") " pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.621055 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-grvpk\" (UniqueName: \"kubernetes.io/projected/6d52104b-91e7-4a3a-9138-163eb850485d-kube-api-access-grvpk\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.621189 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njj8f\" (UniqueName: \"kubernetes.io/projected/628fd7f0-d4b6-4866-b7d4-6966ed698611-kube-api-access-njj8f\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.621379 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-serving-cert\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.621510 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a506e2e-c940-4f10-b89c-948d10ba8902-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.621567 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4km46\" (UniqueName: \"kubernetes.io/projected/8efc7eee-3b20-4cdf-9062-d64472b2c888-kube-api-access-4km46\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " 
pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.623100 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-signing-cabundle\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.624323 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/296ae94a-36e6-480b-9395-8f6a96621fdf-service-ca-bundle\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.624416 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.624833 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf64r\" (UniqueName: \"kubernetes.io/projected/83427963-071f-40a0-8988-b39a3d41e59f-kube-api-access-jf64r\") pod \"migrator-59844c95c7-gtjx7\" (UID: \"83427963-071f-40a0-8988-b39a3d41e59f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.624899 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/ad67d365-7ef5-406c-9ffe-6f66253704c9-profile-collector-cert\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625086 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cea58b47-da5e-4dc7-be23-19d8408318d7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jhzxl\" (UID: \"cea58b47-da5e-4dc7-be23-19d8408318d7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625142 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625255 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-bound-sa-token\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625347 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9lz\" (UniqueName: \"kubernetes.io/projected/1ce6cce5-c0bb-4d10-8458-bb9e15832a9c-kube-api-access-vc9lz\") pod \"package-server-manager-789f6589d5-blr59\" (UID: \"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625375 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1a506e2e-c940-4f10-b89c-948d10ba8902-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625448 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/628fd7f0-d4b6-4866-b7d4-6966ed698611-proxy-tls\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.625497 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.125481512 +0000 UTC m=+144.468409831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625543 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8efc7eee-3b20-4cdf-9062-d64472b2c888-node-bootstrap-token\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625592 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxf8g\" (UniqueName: \"kubernetes.io/projected/1a506e2e-c940-4f10-b89c-948d10ba8902-kube-api-access-qxf8g\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625629 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1679c4a6-a707-4150-825b-5cb8b90cb27c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.625734 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-tls\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.626396 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-trusted-ca\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.626584 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cd3f585-c95f-43ee-962c-ea33aff90415-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.626951 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-console-config\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.627223 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-config\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.627418 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7rsrv\" (UniqueName: \"kubernetes.io/projected/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-kube-api-access-7rsrv\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.627865 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvptj\" (UniqueName: \"kubernetes.io/projected/f799cad1-5a28-4af5-8070-5c365cddbf78-kube-api-access-jvptj\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.627990 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-mountpoint-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629142 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad67d365-7ef5-406c-9ffe-6f66253704c9-srv-cert\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629167 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c758606a-b3e4-494e-a2a6-7a7320277b37-tmpfs\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629183 
4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-config\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629214 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb437c-beea-4f2d-b3f7-505b87fe6dee-config-volume\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629231 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-oauth-serving-cert\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629369 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjkns\" (UniqueName: \"kubernetes.io/projected/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-kube-api-access-fjkns\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629416 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-registration-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " 
pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629441 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/628fd7f0-d4b6-4866-b7d4-6966ed698611-images\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629463 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c758606a-b3e4-494e-a2a6-7a7320277b37-apiservice-cert\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629517 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cd3f585-c95f-43ee-962c-ea33aff90415-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629534 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f799cad1-5a28-4af5-8070-5c365cddbf78-config-volume\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629564 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a506e2e-c940-4f10-b89c-948d10ba8902-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629589 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1679c4a6-a707-4150-825b-5cb8b90cb27c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629801 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r66m8\" (UniqueName: \"kubernetes.io/projected/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-kube-api-access-r66m8\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629821 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87r8v\" (UniqueName: \"kubernetes.io/projected/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-kube-api-access-87r8v\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629843 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f799cad1-5a28-4af5-8070-5c365cddbf78-metrics-tls\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629934 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-signing-key\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629961 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-oauth-config\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.629980 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-default-certificate\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.630011 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6grq\" (UniqueName: \"kubernetes.io/projected/cea58b47-da5e-4dc7-be23-19d8408318d7-kube-api-access-x6grq\") pod \"control-plane-machine-set-operator-78cbb6b69f-jhzxl\" (UID: \"cea58b47-da5e-4dc7-be23-19d8408318d7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.630040 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/628fd7f0-d4b6-4866-b7d4-6966ed698611-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.630652 
4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-service-ca\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.631251 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-trusted-ca-bundle\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.631915 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1679c4a6-a707-4150-825b-5cb8b90cb27c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.632202 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c758606a-b3e4-494e-a2a6-7a7320277b37-webhook-cert\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.628366 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-serving-cert\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 
14:08:37.632637 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.633222 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-config\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.633406 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-stats-auth\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.633844 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-oauth-serving-cert\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.634632 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c758606a-b3e4-494e-a2a6-7a7320277b37-tmpfs\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 
14:08:37.634636 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb437c-beea-4f2d-b3f7-505b87fe6dee-config-volume\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.636270 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cd3f585-c95f-43ee-962c-ea33aff90415-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.636824 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1679c4a6-a707-4150-825b-5cb8b90cb27c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.637135 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/628fd7f0-d4b6-4866-b7d4-6966ed698611-images\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.639628 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-serving-cert\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " 
pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.640306 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad67d365-7ef5-406c-9ffe-6f66253704c9-srv-cert\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.640541 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/628fd7f0-d4b6-4866-b7d4-6966ed698611-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.640661 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-default-certificate\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.640681 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84mc8\" (UniqueName: \"kubernetes.io/projected/c758606a-b3e4-494e-a2a6-7a7320277b37-kube-api-access-84mc8\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.641709 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-signing-key\") pod \"service-ca-9c57cc56f-72n7k\" 
(UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.641719 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-certificates\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.641981 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-proxy-tls\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.649958 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-oauth-config\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.650212 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a506e2e-c940-4f10-b89c-948d10ba8902-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.650655 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/4cd3f585-c95f-43ee-962c-ea33aff90415-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.652491 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-tls\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.652691 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-certificates\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.653662 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wsdcq\" (UID: \"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.654142 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad67d365-7ef5-406c-9ffe-6f66253704c9-profile-collector-cert\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.654200 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c758606a-b3e4-494e-a2a6-7a7320277b37-apiservice-cert\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.654886 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91eb437c-beea-4f2d-b3f7-505b87fe6dee-secret-volume\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.655044 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cea58b47-da5e-4dc7-be23-19d8408318d7-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jhzxl\" (UID: \"cea58b47-da5e-4dc7-be23-19d8408318d7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.657126 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-proxy-tls\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.667481 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/296ae94a-36e6-480b-9395-8f6a96621fdf-metrics-certs\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " 
pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.668410 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce6cce5-c0bb-4d10-8458-bb9e15832a9c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-blr59\" (UID: \"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.673385 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdnd\" (UniqueName: \"kubernetes.io/projected/91eb437c-beea-4f2d-b3f7-505b87fe6dee-kube-api-access-5kdnd\") pod \"collect-profiles-29522280-c8fps\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.681970 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/628fd7f0-d4b6-4866-b7d4-6966ed698611-proxy-tls\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.689468 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd2r2\" (UniqueName: \"kubernetes.io/projected/ad67d365-7ef5-406c-9ffe-6f66253704c9-kube-api-access-gd2r2\") pod \"catalog-operator-68c6474976-wgzvh\" (UID: \"ad67d365-7ef5-406c-9ffe-6f66253704c9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.757220 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds"] Feb 17 
14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.757362 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.757560 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.257531728 +0000 UTC m=+144.600459997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.757657 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-plugins-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.757767 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8efc7eee-3b20-4cdf-9062-d64472b2c888-certs\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " 
pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.757883 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f70daa4b-d685-406a-ba3a-7fa6d672acdd-cert\") pod \"ingress-canary-9kmt4\" (UID: \"f70daa4b-d685-406a-ba3a-7fa6d672acdd\") " pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.757940 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-socket-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.757976 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-csi-data-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758022 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx2zw\" (UniqueName: \"kubernetes.io/projected/f70daa4b-d685-406a-ba3a-7fa6d672acdd-kube-api-access-lx2zw\") pod \"ingress-canary-9kmt4\" (UID: \"f70daa4b-d685-406a-ba3a-7fa6d672acdd\") " pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758065 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4km46\" (UniqueName: \"kubernetes.io/projected/8efc7eee-3b20-4cdf-9062-d64472b2c888-kube-api-access-4km46\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " 
pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758121 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758153 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8efc7eee-3b20-4cdf-9062-d64472b2c888-node-bootstrap-token\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758242 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvptj\" (UniqueName: \"kubernetes.io/projected/f799cad1-5a28-4af5-8070-5c365cddbf78-kube-api-access-jvptj\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758275 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-mountpoint-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758314 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-registration-dir\") pod 
\"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758335 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f799cad1-5a28-4af5-8070-5c365cddbf78-config-volume\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758427 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87r8v\" (UniqueName: \"kubernetes.io/projected/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-kube-api-access-87r8v\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758450 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f799cad1-5a28-4af5-8070-5c365cddbf78-metrics-tls\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758498 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfgf7\" (UniqueName: \"kubernetes.io/projected/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-kube-api-access-rfgf7\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.758997 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-plugins-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: 
\"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.759007 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.759140 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a506e2e-c940-4f10-b89c-948d10ba8902-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.759217 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-mountpoint-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.759953 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.259932982 +0000 UTC m=+144.602861251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.760542 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f799cad1-5a28-4af5-8070-5c365cddbf78-config-volume\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.760636 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-registration-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.760840 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-socket-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.764655 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-csi-data-dir\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:37 crc 
kubenswrapper[4836]: I0217 14:08:37.764756 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfgf7\" (UniqueName: \"kubernetes.io/projected/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-kube-api-access-rfgf7\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.765707 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8efc7eee-3b20-4cdf-9062-d64472b2c888-certs\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.767314 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8efc7eee-3b20-4cdf-9062-d64472b2c888-node-bootstrap-token\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.790196 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f799cad1-5a28-4af5-8070-5c365cddbf78-metrics-tls\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.790739 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f70daa4b-d685-406a-ba3a-7fa6d672acdd-cert\") pod \"ingress-canary-9kmt4\" (UID: \"f70daa4b-d685-406a-ba3a-7fa6d672acdd\") " pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.795182 4836 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-khbdr"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.797072 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9t82\" (UniqueName: \"kubernetes.io/projected/296ae94a-36e6-480b-9395-8f6a96621fdf-kube-api-access-p9t82\") pod \"router-default-5444994796-fqzrl\" (UID: \"296ae94a-36e6-480b-9395-8f6a96621fdf\") " pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.798398 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhp9d\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-kube-api-access-vhp9d\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.799995 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grvpk\" (UniqueName: \"kubernetes.io/projected/6d52104b-91e7-4a3a-9138-163eb850485d-kube-api-access-grvpk\") pod \"console-f9d7485db-6zspj\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.811795 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7klmp"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.813532 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-bound-sa-token\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.822780 4836 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.823438 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5l6x4"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.829892 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.839385 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9lz\" (UniqueName: \"kubernetes.io/projected/1ce6cce5-c0bb-4d10-8458-bb9e15832a9c-kube-api-access-vc9lz\") pod \"package-server-manager-789f6589d5-blr59\" (UID: \"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.852050 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.853478 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njj8f\" (UniqueName: \"kubernetes.io/projected/628fd7f0-d4b6-4866-b7d4-6966ed698611-kube-api-access-njj8f\") pod \"machine-config-operator-74547568cd-qc9j6\" (UID: \"628fd7f0-d4b6-4866-b7d4-6966ed698611\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.861686 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.862275 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.362251788 +0000 UTC m=+144.705180057 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.864137 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.864515 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf64r\" (UniqueName: \"kubernetes.io/projected/83427963-071f-40a0-8988-b39a3d41e59f-kube-api-access-jf64r\") pod \"migrator-59844c95c7-gtjx7\" (UID: \"83427963-071f-40a0-8988-b39a3d41e59f\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.870128 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-8sd2q"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.870813 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.889406 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ch9j6"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.892558 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.897440 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1679c4a6-a707-4150-825b-5cb8b90cb27c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-2xtf8\" (UID: \"1679c4a6-a707-4150-825b-5cb8b90cb27c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.898193 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw"] Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.900280 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.907094 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxf8g\" (UniqueName: \"kubernetes.io/projected/1a506e2e-c940-4f10-b89c-948d10ba8902-kube-api-access-qxf8g\") pod \"cluster-image-registry-operator-dc59b4c8b-4bfcw\" (UID: \"1a506e2e-c940-4f10-b89c-948d10ba8902\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.916926 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.923885 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rsrv\" (UniqueName: \"kubernetes.io/projected/56a1d7ef-ccae-4b8e-b94f-edee0ce6e902-kube-api-access-7rsrv\") pod \"machine-config-controller-84d6567774-l5kfm\" (UID: \"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.926084 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" Feb 17 14:08:37 crc kubenswrapper[4836]: W0217 14:08:37.939203 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19216a1e_34af_4764_a621_e5097db4751b.slice/crio-c732fed9106b6ad88780358d79674f3866d5bc1bee0d51410d5390775fcf4cbc WatchSource:0}: Error finding container c732fed9106b6ad88780358d79674f3866d5bc1bee0d51410d5390775fcf4cbc: Status 404 returned error can't find the container with id c732fed9106b6ad88780358d79674f3866d5bc1bee0d51410d5390775fcf4cbc Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.959031 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjkns\" (UniqueName: \"kubernetes.io/projected/b8fcc519-bc1d-4a7e-8005-cb29f435f4e5-kube-api-access-fjkns\") pod \"service-ca-operator-777779d784-ld8ls\" (UID: \"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.963127 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:37 crc kubenswrapper[4836]: E0217 14:08:37.963588 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.463568428 +0000 UTC m=+144.806496757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:37 crc kubenswrapper[4836]: I0217 14:08:37.975105 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r66m8\" (UniqueName: \"kubernetes.io/projected/0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00-kube-api-access-r66m8\") pod \"service-ca-9c57cc56f-72n7k\" (UID: \"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00\") " pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.011852 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84mc8\" (UniqueName: \"kubernetes.io/projected/c758606a-b3e4-494e-a2a6-7a7320277b37-kube-api-access-84mc8\") pod \"packageserver-d55dfcdfc-x2x76\" (UID: \"c758606a-b3e4-494e-a2a6-7a7320277b37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.019962 4836 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x6grq\" (UniqueName: \"kubernetes.io/projected/cea58b47-da5e-4dc7-be23-19d8408318d7-kube-api-access-x6grq\") pod \"control-plane-machine-set-operator-78cbb6b69f-jhzxl\" (UID: \"cea58b47-da5e-4dc7-be23-19d8408318d7\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.057966 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4km46\" (UniqueName: \"kubernetes.io/projected/8efc7eee-3b20-4cdf-9062-d64472b2c888-kube-api-access-4km46\") pod \"machine-config-server-8ngwr\" (UID: \"8efc7eee-3b20-4cdf-9062-d64472b2c888\") " pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.064806 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.065241 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.565221657 +0000 UTC m=+144.908149926 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.066515 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.070922 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvptj\" (UniqueName: \"kubernetes.io/projected/f799cad1-5a28-4af5-8070-5c365cddbf78-kube-api-access-jvptj\") pod \"dns-default-284hg\" (UID: \"f799cad1-5a28-4af5-8070-5c365cddbf78\") " pod="openshift-dns/dns-default-284hg" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.094501 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87r8v\" (UniqueName: \"kubernetes.io/projected/c2f4f6fb-d604-402f-83d0-6b25781c3aa8-kube-api-access-87r8v\") pod \"csi-hostpathplugin-6scjm\" (UID: \"c2f4f6fb-d604-402f-83d0-6b25781c3aa8\") " pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.105128 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx2zw\" (UniqueName: \"kubernetes.io/projected/f70daa4b-d685-406a-ba3a-7fa6d672acdd-kube-api-access-lx2zw\") pod \"ingress-canary-9kmt4\" (UID: \"f70daa4b-d685-406a-ba3a-7fa6d672acdd\") " pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.119639 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.133235 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.134420 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.167557 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.168205 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.66817272 +0000 UTC m=+145.011101149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.177498 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.211777 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.250209 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.258032 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.258914 4836 csr.go:261] certificate signing request csr-c4bdx is approved, waiting to be issued Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.271774 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-284hg" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.272251 4836 csr.go:257] certificate signing request csr-c4bdx is issued Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.272684 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273005 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273068 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273102 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273131 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273164 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273211 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273238 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.273264 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.274001 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.282108 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-config\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.283126 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.283470 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.283554 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.783532414 +0000 UTC m=+145.126460683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.283939 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66402e53-3287-45c4-bceb-78fc99836c5b-image-import-ca\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.284252 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-images\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.293775 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66402e53-3287-45c4-bceb-78fc99836c5b-encryption-config\") pod \"apiserver-76f77b778f-cnq25\" (UID: \"66402e53-3287-45c4-bceb-78fc99836c5b\") " pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.294325 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ecc7c98-e9a3-4850-a741-7e0bcf670e27-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jjmwc\" (UID: \"1ecc7c98-e9a3-4850-a741-7e0bcf670e27\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.297151 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.317354 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9kmt4" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.317372 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8ngwr" Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.382251 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.382687 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.882671675 +0000 UTC m=+145.225599944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.458152 4836 generic.go:334] "Generic (PLEG): container finished" podID="c26f912f-f640-4b4c-ab61-dd2a163f12ab" containerID="ff70bd1f19cb66489eb8021670929dc34b2836747cd3559cb1b40b76e9b0db37" exitCode=0
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.458235 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" event={"ID":"c26f912f-f640-4b4c-ab61-dd2a163f12ab","Type":"ContainerDied","Data":"ff70bd1f19cb66489eb8021670929dc34b2836747cd3559cb1b40b76e9b0db37"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.458264 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" event={"ID":"c26f912f-f640-4b4c-ab61-dd2a163f12ab","Type":"ContainerStarted","Data":"3189bd3a0db29daf1f670c5fde8aa9c551657011478702a0b61c6a161156f464"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.466427 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kcm8s" event={"ID":"2840702b-d22f-4184-bada-4cd337d79407","Type":"ContainerStarted","Data":"4be7a02a68429fd3aca6fe66c5f908cf70143baa5ed0dc0dc81c5956e39f2350"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.467667 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kcm8s"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.471800 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" event={"ID":"331c189b-0cb5-4733-a233-894429c709a9","Type":"ContainerStarted","Data":"6d996fae9b23ebe40130923ec7457c18e45a41798247aa59f7da602cb14a7891"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.471830 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" event={"ID":"331c189b-0cb5-4733-a233-894429c709a9","Type":"ContainerStarted","Data":"00ae5e23519b93a235c719050c0aaaa27453e0ec9d39273f3fd74363f96401c4"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.472695 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.475840 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" event={"ID":"8c77bcf1-4025-4c35-9580-41e9a61195e8","Type":"ContainerStarted","Data":"48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.475874 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" event={"ID":"8c77bcf1-4025-4c35-9580-41e9a61195e8","Type":"ContainerStarted","Data":"b99d73db17eb9c6b2aa85ca03f0903902f643a2f2fbc708d9b4c51f4e9d1ede7"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.476602 4836 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sknds container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.476641 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" podUID="331c189b-0cb5-4733-a233-894429c709a9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.476681 4836 patch_prober.go:28] interesting pod/console-operator-58897d9998-kcm8s container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.476776 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kcm8s" podUID="2840702b-d22f-4184-bada-4cd337d79407" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.476855 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.478141 4836 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-5l6x4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.478237 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" podUID="8c77bcf1-4025-4c35-9580-41e9a61195e8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.479580 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" event={"ID":"3bb5e0b8-9179-4570-a3d8-acaa80b2c884","Type":"ContainerStarted","Data":"f4657696ca94735e533f18cf33a4d84536d0fa2d79311a5bbd8096d04f6aa8de"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.479624 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" event={"ID":"3bb5e0b8-9179-4570-a3d8-acaa80b2c884","Type":"ContainerStarted","Data":"fbc7ee12dfd6b918ba68fe183a564ff72718b972a1838e5e90efb3b9a4d21428"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.493018 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" event={"ID":"9a3f2789-dd41-4e95-8174-db3a40098b0e","Type":"ContainerStarted","Data":"8b955158897397fb5f9170923b488df06517409802fadf21a2572eed8b5bec46"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.494448 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.494840 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.994798322 +0000 UTC m=+145.337726591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.495162 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.496257 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" event={"ID":"5ad14aa6-962d-4f8f-babe-745f65d63560","Type":"ContainerStarted","Data":"7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.496975 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.498474 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:38.99845604 +0000 UTC m=+145.341384309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.512486 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" event={"ID":"65684d1d-5242-464d-8caf-ad4866bf6a86","Type":"ContainerStarted","Data":"8b2041a0bc951b67664b08a48ffa757dc83671e656c0a06eb067f557cc4c5a47"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.514263 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" event={"ID":"985bc83c-52fa-45dc-ab4f-6e47ee47683e","Type":"ContainerStarted","Data":"4b1cfa0180186477ad01885b0380528a2ed9a9e38e3b90ab0219a2e26e3de881"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.514346 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" event={"ID":"985bc83c-52fa-45dc-ab4f-6e47ee47683e","Type":"ContainerStarted","Data":"7d0ca8f5e10670b96b45ab236df0ffcb5b0c0577a99d998beb3a30327978aa5e"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.514703 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.520568 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" event={"ID":"171e2af0-2993-4cd3-942f-043bccca2813","Type":"ContainerStarted","Data":"ce2afad92b1d3e7f3446573cfc68acc3678a98ba8dd05c52e22c8acd6f8e122e"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.521182 4836 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-2mmw4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.521225 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" podUID="5ad14aa6-962d-4f8f-babe-745f65d63560" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.524439 4836 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-khbdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.524480 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" podUID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.568727 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.575119 4836 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6rsds container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body=
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.575176 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" podUID="c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.596742 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.596760 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.602772 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.102719948 +0000 UTC m=+145.445648217 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.605512 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.632314 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.119814752 +0000 UTC m=+145.462743021 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639863 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" event={"ID":"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7","Type":"ContainerStarted","Data":"374ae013639e0d9afa8e234c5feaec0812c5a1b8b7085c57cb72bf432395a8d0"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639894 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" event={"ID":"18ec8466-f311-4f81-ae38-48635b000ced","Type":"ContainerStarted","Data":"648849123f32076d0d03d9d2a494fbc3dfd2c9e02da687a37fcb474c75f57692"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639908 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639917 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" event={"ID":"19216a1e-34af-4764-a621-e5097db4751b","Type":"ContainerStarted","Data":"c732fed9106b6ad88780358d79674f3866d5bc1bee0d51410d5390775fcf4cbc"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639926 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" event={"ID":"1d30a99f-a727-4eb4-9a32-0508707384bf","Type":"ContainerStarted","Data":"a735d305d37e372b5eab7f1f96601405978fd0c74ac2a2f77b58f60958a90ac5"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639935 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" event={"ID":"1d30a99f-a727-4eb4-9a32-0508707384bf","Type":"ContainerStarted","Data":"7291ebb14322333ea30d034cdbbbe26e6099c047841086ef774fa19acbdc1145"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639944 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" event={"ID":"95872171-94c1-4b8a-935f-ae180a4e3d11","Type":"ContainerStarted","Data":"c7d64b67b1393749363c0e40b1d7a250a383bf75888f287bcd7649be5c519540"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.639956 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59"]
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.648431 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" event={"ID":"921ecdc3-b5f3-44e4-9300-d25342d944d8","Type":"ContainerStarted","Data":"1cee4451298f59a86bd26b4891652bce600348c354341a200fb5f066413bf22b"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.679602 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" event={"ID":"f8080d32-cbe7-4b02-8791-d9f1f9aca269","Type":"ContainerStarted","Data":"7157c1b2b508140b34c05778d3e34a29b1d928f04cfa7d5b8e263650474961e2"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.717849 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" event={"ID":"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40","Type":"ContainerStarted","Data":"e8c4fbe0705bb8961a7928968738bd6e008c9707dd0c001e1e2eaef63fd0999c"}
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.722609 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.727492 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.2274561 +0000 UTC m=+145.570384379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.729557 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.735779 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.735851 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.736349 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.236330125 +0000 UTC m=+145.579258394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.749534 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh"]
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.827353 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq"]
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.831356 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.831976 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.331962614 +0000 UTC m=+145.674890883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.943336 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:38 crc kubenswrapper[4836]: E0217 14:08:38.943891 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.443876526 +0000 UTC m=+145.786804805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:38 crc kubenswrapper[4836]: I0217 14:08:38.997985 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" podStartSLOduration=122.997963832 podStartE2EDuration="2m2.997963832s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:38.976210584 +0000 UTC m=+145.319138853" watchObservedRunningTime="2026-02-17 14:08:38.997963832 +0000 UTC m=+145.340892111"
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.061659 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.062104 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.562083724 +0000 UTC m=+145.905011993 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.067383 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" podStartSLOduration=123.067347383 podStartE2EDuration="2m3.067347383s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.030883135 +0000 UTC m=+145.373811424" watchObservedRunningTime="2026-02-17 14:08:39.067347383 +0000 UTC m=+145.410275652"
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.115775 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-rq6q9" podStartSLOduration=124.115754309 podStartE2EDuration="2m4.115754309s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.103791641 +0000 UTC m=+145.446719920" watchObservedRunningTime="2026-02-17 14:08:39.115754309 +0000 UTC m=+145.458682578"
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.119846 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6"]
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.181364 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.181953 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.681927996 +0000 UTC m=+146.024856275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.207157 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" podStartSLOduration=124.207136695 podStartE2EDuration="2m4.207136695s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.189426886 +0000 UTC m=+145.532355165" watchObservedRunningTime="2026-02-17 14:08:39.207136695 +0000 UTC m=+145.550064964"
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.209610 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-6zspj"]
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.274397 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 14:03:38 +0000 UTC, rotation deadline is 2026-11-17 03:14:03.55217016 +0000 UTC
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.274476 4836 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6541h5m24.277697467s for next certificate rotation
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.289149 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.289557 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.789538833 +0000 UTC m=+146.132467102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.403540 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.408256 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:39.908210994 +0000 UTC m=+146.251139273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.413756 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" podStartSLOduration=123.413727731 podStartE2EDuration="2m3.413727731s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.333962773 +0000 UTC m=+145.676891052" watchObservedRunningTime="2026-02-17 14:08:39.413727731 +0000 UTC m=+145.756656000"
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.443971 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"]
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.459713 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wd65t" podStartSLOduration=124.459693841 podStartE2EDuration="2m4.459693841s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.458245023 +0000 UTC m=+145.801173312" watchObservedRunningTime="2026-02-17 14:08:39.459693841 +0000 UTC m=+145.802622110"
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.506596 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.507093 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.007066888 +0000 UTC m=+146.349995157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.513096 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw"]
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.528371 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dsrq8" podStartSLOduration=124.528321974 podStartE2EDuration="2m4.528321974s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.51128756 +0000 UTC m=+145.854215839" watchObservedRunningTime="2026-02-17 14:08:39.528321974 +0000 UTC m=+145.871250243"
Feb 17 14:08:39 crc kubenswrapper[4836]: I0217
14:08:39.546565 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cwlxz" podStartSLOduration=124.546537297 podStartE2EDuration="2m4.546537297s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.543576849 +0000 UTC m=+145.886505118" watchObservedRunningTime="2026-02-17 14:08:39.546537297 +0000 UTC m=+145.889465586" Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.608506 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.108491791 +0000 UTC m=+146.451420060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.608104 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:39 crc kubenswrapper[4836]: W0217 14:08:39.616059 4836 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91eb437c_beea_4f2d_b3f7_505b87fe6dee.slice/crio-d146dcdb5314cbe00f41e751cbb49254cf23ab6e8cdc06caed3ce29aac7230d2 WatchSource:0}: Error finding container d146dcdb5314cbe00f41e751cbb49254cf23ab6e8cdc06caed3ce29aac7230d2: Status 404 returned error can't find the container with id d146dcdb5314cbe00f41e751cbb49254cf23ab6e8cdc06caed3ce29aac7230d2 Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.651823 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" podStartSLOduration=124.651804832 podStartE2EDuration="2m4.651804832s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.610178636 +0000 UTC m=+145.953106905" watchObservedRunningTime="2026-02-17 14:08:39.651804832 +0000 UTC m=+145.994733101" Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.715083 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.715271 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.215246346 +0000 UTC m=+146.558174615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.715734 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.716036 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.216023907 +0000 UTC m=+146.558952176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.736985 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kcm8s" podStartSLOduration=124.736962883 podStartE2EDuration="2m4.736962883s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.731162739 +0000 UTC m=+146.074091028" watchObservedRunningTime="2026-02-17 14:08:39.736962883 +0000 UTC m=+146.079891152" Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.749252 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5cbbv" podStartSLOduration=124.749230338 podStartE2EDuration="2m4.749230338s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.745487629 +0000 UTC m=+146.088415898" watchObservedRunningTime="2026-02-17 14:08:39.749230338 +0000 UTC m=+146.092158617" Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.780783 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-98frx" podStartSLOduration=124.780759796 podStartE2EDuration="2m4.780759796s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:39.778950948 +0000 UTC m=+146.121879237" watchObservedRunningTime="2026-02-17 14:08:39.780759796 +0000 UTC m=+146.123688065" Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.832062 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.832596 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.332567381 +0000 UTC m=+146.675495650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.833332 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.833990 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.333960258 +0000 UTC m=+146.676888707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.860267 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6zspj" event={"ID":"6d52104b-91e7-4a3a-9138-163eb850485d","Type":"ContainerStarted","Data":"291ff510753e6307affd77e72c2b113e622f07b799c9441e606ef5eb3889b1a8"} Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.943631 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:39 crc kubenswrapper[4836]: E0217 14:08:39.944678 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.444660207 +0000 UTC m=+146.787588476 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.956866 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" event={"ID":"19216a1e-34af-4764-a621-e5097db4751b","Type":"ContainerStarted","Data":"6a0184fe57af3eace5e14316dfc3c005fa9a071572d7b984b02e9739363bc416"} Feb 17 14:08:39 crc kubenswrapper[4836]: I0217 14:08:39.983917 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9kmt4"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.001049 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" event={"ID":"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c","Type":"ContainerStarted","Data":"8a43311cfd525de309c9c9c8044357beee78fd6571a5dfcbbc4d511abab80b36"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.032738 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fqzrl" event={"ID":"296ae94a-36e6-480b-9395-8f6a96621fdf","Type":"ContainerStarted","Data":"c80137cc1b82e9221143cce7ce317f4b02d9faac0b0555753344e61db88522d2"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.035239 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" event={"ID":"1a506e2e-c940-4f10-b89c-948d10ba8902","Type":"ContainerStarted","Data":"6bc5f32428c6d60602cc27b761cbfd980b04d3bb8f94c56d7b88d34c6d56a1d1"} Feb 17 
14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.038311 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8ngwr" event={"ID":"8efc7eee-3b20-4cdf-9062-d64472b2c888","Type":"ContainerStarted","Data":"28116c03b63599f37e6ada53240987953f887af6aa25d8c6d798ad25c7708a0f"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.045966 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.047091 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.547075626 +0000 UTC m=+146.890003895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.063869 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" event={"ID":"18ec8466-f311-4f81-ae38-48635b000ced","Type":"ContainerStarted","Data":"2a96effe68dcde9ae3d5061b948e7a8c5c380763dd8cf4c65d2d0978d91906d5"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.101494 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.104194 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" event={"ID":"628fd7f0-d4b6-4866-b7d4-6966ed698611","Type":"ContainerStarted","Data":"e0f2e827a3faadf00ce4c1bb29a3260e33bcd5ff37992adb5957692b8d9df39d"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.114552 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" event={"ID":"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb","Type":"ContainerStarted","Data":"062c2b9f95e0f8e02647d29a212805138a081d232a028b28cd3b646963fab553"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.120172 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nncbw" podStartSLOduration=125.120156336 
podStartE2EDuration="2m5.120156336s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:40.119406817 +0000 UTC m=+146.462335096" watchObservedRunningTime="2026-02-17 14:08:40.120156336 +0000 UTC m=+146.463084605" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.131911 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" event={"ID":"3bb5e0b8-9179-4570-a3d8-acaa80b2c884","Type":"ContainerStarted","Data":"6e429a13f72b86b358a3b35792870f01ef0a4205e81bb53c3f33656e1241d64c"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.146520 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-fqzrl" podStartSLOduration=125.146503556 podStartE2EDuration="2m5.146503556s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:40.145464178 +0000 UTC m=+146.488392467" watchObservedRunningTime="2026-02-17 14:08:40.146503556 +0000 UTC m=+146.489431825" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.147415 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.149078 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:40.649062364 +0000 UTC m=+146.991990633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.199350 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7htl2" podStartSLOduration=125.199328969 podStartE2EDuration="2m5.199328969s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:40.195153598 +0000 UTC m=+146.538081877" watchObservedRunningTime="2026-02-17 14:08:40.199328969 +0000 UTC m=+146.542257228" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.221143 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.272691 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.273327 4836 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.773303783 +0000 UTC m=+147.116232062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.280127 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" event={"ID":"65684d1d-5242-464d-8caf-ad4866bf6a86","Type":"ContainerStarted","Data":"e133e0f2c1fff6df572a502162daf44645b3156b1cdfd9eeb6c1f6241f00aeea"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.312527 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" event={"ID":"921ecdc3-b5f3-44e4-9300-d25342d944d8","Type":"ContainerStarted","Data":"2b43faccef7aab4cdf7763f3ecfedc0f078099ff98d9886a922cb068c209a6fa"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.319910 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7klmp" podStartSLOduration=125.31988893 podStartE2EDuration="2m5.31988893s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:40.317155877 +0000 UTC m=+146.660084166" watchObservedRunningTime="2026-02-17 14:08:40.31988893 +0000 UTC m=+146.662817199" Feb 17 14:08:40 crc 
kubenswrapper[4836]: I0217 14:08:40.374556 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.374960 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.874941431 +0000 UTC m=+147.217869710 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.391556 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" event={"ID":"ad67d365-7ef5-406c-9ffe-6f66253704c9","Type":"ContainerStarted","Data":"fa04f773a50c12114b3402cf501ef4d0760330312b6153b8ea03d1ab79f5a403"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.392093 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.406953 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" 
event={"ID":"91eb437c-beea-4f2d-b3f7-505b87fe6dee","Type":"ContainerStarted","Data":"d146dcdb5314cbe00f41e751cbb49254cf23ab6e8cdc06caed3ce29aac7230d2"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.421196 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-284hg"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.437017 4836 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wgzvh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.437086 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" podUID="ad67d365-7ef5-406c-9ffe-6f66253704c9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.449653 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" podStartSLOduration=125.449637755 podStartE2EDuration="2m5.449637755s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:40.447845697 +0000 UTC m=+146.790773966" watchObservedRunningTime="2026-02-17 14:08:40.449637755 +0000 UTC m=+146.792566024" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.454911 4836 generic.go:334] "Generic (PLEG): container finished" podID="e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40" containerID="bf8d1eedc34d2af6d2e6de50ac54dac0113b8fb73bd5dae605e4e5b4b2165e46" exitCode=0 Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 
14:08:40.455891 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" event={"ID":"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40","Type":"ContainerDied","Data":"bf8d1eedc34d2af6d2e6de50ac54dac0113b8fb73bd5dae605e4e5b4b2165e46"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.477676 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.481607 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:40.981459249 +0000 UTC m=+147.324387518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.484897 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jjmwc"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.504012 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.527133 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" event={"ID":"171e2af0-2993-4cd3-942f-043bccca2813","Type":"ContainerStarted","Data":"e7d432d1e079472b278684cf08df1245f9dc63f92ea016424fa5c0c99aa22306"} Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.530978 4836 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-khbdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.531035 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" podUID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.532248 4836 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-6scjm"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.536553 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.539051 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.550636 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.557489 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.580256 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.581177 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.081155307 +0000 UTC m=+147.424083586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.587204 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sknds" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.587243 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-72n7k"] Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.635364 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kcm8s" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.688247 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.719135 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.219117879 +0000 UTC m=+147.562046218 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: W0217 14:08:40.765080 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fc0c6ee_c7d3_4e99_a6c2_b8de8f7ffa00.slice/crio-7e91d1245e3a5f406f922429283277502660f1176e6a84a89b48e1271e943fc8 WatchSource:0}: Error finding container 7e91d1245e3a5f406f922429283277502660f1176e6a84a89b48e1271e943fc8: Status 404 returned error can't find the container with id 7e91d1245e3a5f406f922429283277502660f1176e6a84a89b48e1271e943fc8 Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.792436 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.793471 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.293452453 +0000 UTC m=+147.636380732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.906119 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:40 crc kubenswrapper[4836]: E0217 14:08:40.906572 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.406526616 +0000 UTC m=+147.749454885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.918263 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.972942 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 17 14:08:40 crc kubenswrapper[4836]: I0217 14:08:40.973004 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.009534 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.010192 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.510166887 +0000 UTC m=+147.853095156 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.032685 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls"] Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.062066 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8"] Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.114986 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.115315 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.615301369 +0000 UTC m=+147.958229638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.138929 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cnq25"] Feb 17 14:08:41 crc kubenswrapper[4836]: W0217 14:08:41.155816 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1679c4a6_a707_4150_825b_5cb8b90cb27c.slice/crio-9f3428a83cd09a47ab4feb5e6bc8c24a7d13c5ec588cb5bb2c42e10da3ed95e7 WatchSource:0}: Error finding container 9f3428a83cd09a47ab4feb5e6bc8c24a7d13c5ec588cb5bb2c42e10da3ed95e7: Status 404 returned error can't find the container with id 9f3428a83cd09a47ab4feb5e6bc8c24a7d13c5ec588cb5bb2c42e10da3ed95e7 Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.217224 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.217763 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.717719508 +0000 UTC m=+148.060647767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.217945 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.218359 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.718335284 +0000 UTC m=+148.061263573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.319476 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.320165 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.820143457 +0000 UTC m=+148.163071726 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.421442 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.421812 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:41.921801387 +0000 UTC m=+148.264729656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.526033 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.526557 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.026539168 +0000 UTC m=+148.369467447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.628092 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.628565 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.128551906 +0000 UTC m=+148.471480175 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.636855 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" event={"ID":"1ecc7c98-e9a3-4850-a741-7e0bcf670e27","Type":"ContainerStarted","Data":"fecefbcabce7d581460da721089e3aaf00a80d8b8481ab7813bd28b6b56e33f4"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.636942 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" event={"ID":"1ecc7c98-e9a3-4850-a741-7e0bcf670e27","Type":"ContainerStarted","Data":"abe8216f04612f75edf41ec18a2f6650cca5fd3f31ec4360b8c966e6edcb969b"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.678075 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6zspj" event={"ID":"6d52104b-91e7-4a3a-9138-163eb850485d","Type":"ContainerStarted","Data":"f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.736229 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.737108 4836 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.237086127 +0000 UTC m=+148.580014396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.743098 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" event={"ID":"91eb437c-beea-4f2d-b3f7-505b87fe6dee","Type":"ContainerStarted","Data":"4b8580f44aade0425b4de34e0f49d07bd6192e526f9c10aa11b53556a3546660"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.757326 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" event={"ID":"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902","Type":"ContainerStarted","Data":"a71f5b681c1cf8c3e0036b27104d45355d9eed9eb0c4dbd3b2ed191daafeaa1e"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.757422 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" event={"ID":"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902","Type":"ContainerStarted","Data":"12d23fb1c3176bf201275d87f8aadc303da6853f0b3cacde10cb3ec5cfffcc32"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.787683 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-6zspj" podStartSLOduration=126.78766003 
podStartE2EDuration="2m6.78766003s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:41.73721572 +0000 UTC m=+148.080143999" watchObservedRunningTime="2026-02-17 14:08:41.78766003 +0000 UTC m=+148.130588329" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.790244 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" podStartSLOduration=126.790233368 podStartE2EDuration="2m6.790233368s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:41.785908504 +0000 UTC m=+148.128836793" watchObservedRunningTime="2026-02-17 14:08:41.790233368 +0000 UTC m=+148.133161637" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.803131 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" event={"ID":"c26f912f-f640-4b4c-ab61-dd2a163f12ab","Type":"ContainerStarted","Data":"339b233a0392cc1ae25e160cbae69bfbfda6b94a68d2c6fec86051674706c3bd"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.804087 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.818263 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" event={"ID":"66402e53-3287-45c4-bceb-78fc99836c5b","Type":"ContainerStarted","Data":"ecaf3ef8d304449130aaa81d7ce6f0ebfc459031923106bdf43b8d4e6645a320"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.828353 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" event={"ID":"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c","Type":"ContainerStarted","Data":"2ba11af3b63fe390525fa68100541d0617a3c07adef0279400f0bc4e690218ff"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.828401 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" event={"ID":"1ce6cce5-c0bb-4d10-8458-bb9e15832a9c","Type":"ContainerStarted","Data":"b4b8b7a1f292811ffe1d6ad1d5f62a28882d64527b4d531bfcdee88772ad6c9d"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.829087 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.841666 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.841932 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.34191996 +0000 UTC m=+148.684848229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.845361 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" event={"ID":"1a506e2e-c940-4f10-b89c-948d10ba8902","Type":"ContainerStarted","Data":"3fecb315f3247083dd673ec8f96e4d094038e9776b4dc89179ed6b9b7343abe9"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.862403 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" podStartSLOduration=126.862362234 podStartE2EDuration="2m6.862362234s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:41.860573136 +0000 UTC m=+148.203501415" watchObservedRunningTime="2026-02-17 14:08:41.862362234 +0000 UTC m=+148.205290513" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.868732 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" event={"ID":"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00","Type":"ContainerStarted","Data":"7e91d1245e3a5f406f922429283277502660f1176e6a84a89b48e1271e943fc8"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.875982 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.878746 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" event={"ID":"19216a1e-34af-4764-a621-e5097db4751b","Type":"ContainerStarted","Data":"541952d69cab6c11b3fc7e559178910c58e24409f9e10fa515210c2d7678eafe"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.898430 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" event={"ID":"171e2af0-2993-4cd3-942f-043bccca2813","Type":"ContainerStarted","Data":"e18399186af073660f549c96051d9c998265a5ce1b626eb8518999225e1d2674"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.934048 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:41 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:41 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:41 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.934100 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.938147 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" event={"ID":"759ba4ec-c9e0-43a5-9f05-5a05bde7c3fb","Type":"ContainerStarted","Data":"1896c6ebfdf62fb56a8dfec0d91294e09cab7f8d55a164ff613d06b7c1ca705d"} Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.942218 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.945122 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" event={"ID":"c2f4f6fb-d604-402f-83d0-6b25781c3aa8","Type":"ContainerStarted","Data":"e1b370fccdd3c0cbe1bd48b096eb15a417e6cde60a9f9fca5ca899cf49f77f6f"} Feb 17 14:08:41 crc kubenswrapper[4836]: E0217 14:08:41.945539 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.445518481 +0000 UTC m=+148.788446920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.951585 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59" podStartSLOduration=125.951560751 podStartE2EDuration="2m5.951560751s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:41.944091394 +0000 UTC m=+148.287019673" watchObservedRunningTime="2026-02-17 14:08:41.951560751 +0000 UTC m=+148.294489020" Feb 17 14:08:41 crc 
kubenswrapper[4836]: I0217 14:08:41.966080 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" event={"ID":"628fd7f0-d4b6-4866-b7d4-6966ed698611","Type":"ContainerStarted","Data":"6931602701dbddf7d3aef736b234ff4b522e3851fded6f8218a926b7a1174ab2"}
Feb 17 14:08:41 crc kubenswrapper[4836]: I0217 14:08:41.966161 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" event={"ID":"628fd7f0-d4b6-4866-b7d4-6966ed698611","Type":"ContainerStarted","Data":"e1eeb465451bc50cb039cba588fd31be184af00e413db8d626ef59e16043bdfa"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.000321 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4bfcw" podStartSLOduration=127.000274135 podStartE2EDuration="2m7.000274135s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:41.998270522 +0000 UTC m=+148.341198811" watchObservedRunningTime="2026-02-17 14:08:42.000274135 +0000 UTC m=+148.343202404"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.022395 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" event={"ID":"ad67d365-7ef5-406c-9ffe-6f66253704c9","Type":"ContainerStarted","Data":"388642265f4db9d0735cce02485d442992568c7fd46001c7bc78254a567dbaca"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.027437 4836 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wgzvh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.027494 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" podUID="ad67d365-7ef5-406c-9ffe-6f66253704c9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.049428 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.054143 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.554124645 +0000 UTC m=+148.897053104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.115088 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" event={"ID":"c758606a-b3e4-494e-a2a6-7a7320277b37","Type":"ContainerStarted","Data":"409c23bf268949259e227a8a90fb2343eddc04f3a9768ef7989e55923aeb9dfc"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.115143 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" event={"ID":"c758606a-b3e4-494e-a2a6-7a7320277b37","Type":"ContainerStarted","Data":"228fee33c87b0b716de72ec5eddd835617b8a8b71e05b8b117fbe54c01548e3a"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.116237 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.117928 4836 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x2x76 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body=
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.117967 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" podUID="c758606a-b3e4-494e-a2a6-7a7320277b37" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.120507 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" event={"ID":"921ecdc3-b5f3-44e4-9300-d25342d944d8","Type":"ContainerStarted","Data":"3c505489717a23a07a29188af5920d31395e6fa5597950a0715ef893694d8c29"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.143810 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" event={"ID":"cea58b47-da5e-4dc7-be23-19d8408318d7","Type":"ContainerStarted","Data":"93c66f80e4f7a852f9530feeb27cdd8fe450ee2f132d821836ade2d0abda189b"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.143883 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" event={"ID":"cea58b47-da5e-4dc7-be23-19d8408318d7","Type":"ContainerStarted","Data":"d0b0ff9c3637aff4d5ccd9c5d218aa1b169a1c8970c688131c8c515a8d4e2cf7"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.152045 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.154002 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.653960536 +0000 UTC m=+148.996888985 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.178608 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" event={"ID":"1679c4a6-a707-4150-825b-5cb8b90cb27c","Type":"ContainerStarted","Data":"9f3428a83cd09a47ab4feb5e6bc8c24a7d13c5ec588cb5bb2c42e10da3ed95e7"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.265880 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.271126 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.771103316 +0000 UTC m=+149.114031585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.277802 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ch9j6" podStartSLOduration=127.277784443 podStartE2EDuration="2m7.277784443s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:42.22870399 +0000 UTC m=+148.571632279" watchObservedRunningTime="2026-02-17 14:08:42.277784443 +0000 UTC m=+148.620712712"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.360872 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-fqzrl" event={"ID":"296ae94a-36e6-480b-9395-8f6a96621fdf","Type":"ContainerStarted","Data":"d0f3be10b979ed9ddc992a293ce660fd9546585be5a409a195c0c6066553e222"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.368771 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.369690 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.869670343 +0000 UTC m=+149.212598612 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.382670 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8ngwr" event={"ID":"8efc7eee-3b20-4cdf-9062-d64472b2c888","Type":"ContainerStarted","Data":"bb5cf0343b6bd13a71f5c7f74c1e2686d90220fe8354c4422de092015efed3e1"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.391777 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" event={"ID":"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5","Type":"ContainerStarted","Data":"ae1d3dd43412d1028433bdfbcf6d347aa29f071463e21340063aaf24f68c1104"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.404287 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" event={"ID":"83427963-071f-40a0-8988-b39a3d41e59f","Type":"ContainerStarted","Data":"66c23c64f3d021901675ec2843a8277fb0ae34c926962ac2f3cd7783af1be3d7"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.413511 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-284hg" event={"ID":"f799cad1-5a28-4af5-8070-5c365cddbf78","Type":"ContainerStarted","Data":"5c9c12c8a9ba3079222f116257011486923d4631bdb296cc7bd06e6055fa3f43"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.423189 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qc9j6" podStartSLOduration=127.423147952 podStartE2EDuration="2m7.423147952s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:42.297678872 +0000 UTC m=+148.640607141" watchObservedRunningTime="2026-02-17 14:08:42.423147952 +0000 UTC m=+148.766076222"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.424786 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9kmt4" event={"ID":"f70daa4b-d685-406a-ba3a-7fa6d672acdd","Type":"ContainerStarted","Data":"f55d51822910cf61da64c47ac1417ba9aebe9d1f79039fa69723576dadcf0637"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.424840 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9kmt4" event={"ID":"f70daa4b-d685-406a-ba3a-7fa6d672acdd","Type":"ContainerStarted","Data":"e32b19868a282e2ff1b39a9f475fd5f67a0902937d2759466c7ef84e64469cec"}
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.429225 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.473342 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.474094 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:42.974071735 +0000 UTC m=+149.316999994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.498740 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-8sd2q" podStartSLOduration=127.498705819 podStartE2EDuration="2m7.498705819s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:42.426613095 +0000 UTC m=+148.769541364" watchObservedRunningTime="2026-02-17 14:08:42.498705819 +0000 UTC m=+148.841634098"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.574847 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.576194 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.076176855 +0000 UTC m=+149.419105124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.651522 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wsdcq" podStartSLOduration=127.651497005 podStartE2EDuration="2m7.651497005s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:42.520825506 +0000 UTC m=+148.863753785" watchObservedRunningTime="2026-02-17 14:08:42.651497005 +0000 UTC m=+148.994425274"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.677046 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.677458 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.177441955 +0000 UTC m=+149.520370224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.780639 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.781053 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.281030475 +0000 UTC m=+149.623958734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.808983 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" podStartSLOduration=126.808960966 podStartE2EDuration="2m6.808960966s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:42.651882526 +0000 UTC m=+148.994810795" watchObservedRunningTime="2026-02-17 14:08:42.808960966 +0000 UTC m=+149.151889235"
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.899508 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:42 crc kubenswrapper[4836]: E0217 14:08:42.899936 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.399921822 +0000 UTC m=+149.742850091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.923649 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:42 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:42 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:42 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:42 crc kubenswrapper[4836]: I0217 14:08:42.923720 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.008395 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.008745 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.5087247 +0000 UTC m=+149.851652969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.110375 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.110880 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.610854411 +0000 UTC m=+149.953782860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.143698 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jf2hd" podStartSLOduration=128.143661363 podStartE2EDuration="2m8.143661363s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:42.811934825 +0000 UTC m=+149.154863094" watchObservedRunningTime="2026-02-17 14:08:43.143661363 +0000 UTC m=+149.486589622"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.145492 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.146412 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:43 crc kubenswrapper[4836]: W0217 14:08:43.160539 4836 reflector.go:561] object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n": failed to list *v1.Secret: secrets "installer-sa-dockercfg-kjl2n" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-kube-controller-manager": no relationship found between node 'crc' and this object
Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.160593 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-controller-manager\"/\"installer-sa-dockercfg-kjl2n\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"installer-sa-dockercfg-kjl2n\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-kube-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:43 crc kubenswrapper[4836]: W0217 14:08:43.182449 4836 reflector.go:561] object-"openshift-kube-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-kube-controller-manager": no relationship found between node 'crc' and this object
Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.182507 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-kube-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-kube-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.230477 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.230705 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caed4fb3-6dd4-4427-880f-fee413854d48-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.230781 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/caed4fb3-6dd4-4427-880f-fee413854d48-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.230882 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.730865648 +0000 UTC m=+150.073793917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.250538 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.332145 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.332214 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caed4fb3-6dd4-4427-880f-fee413854d48-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.332352 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/caed4fb3-6dd4-4427-880f-fee413854d48-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.332417 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.332453 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.335496 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caed4fb3-6dd4-4427-880f-fee413854d48-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.336000 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.835977269 +0000 UTC m=+150.178905538 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.345501 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" podStartSLOduration=127.345472801 podStartE2EDuration="2m7.345472801s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:43.342689627 +0000 UTC m=+149.685617916" watchObservedRunningTime="2026-02-17 14:08:43.345472801 +0000 UTC m=+149.688401080"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.350660 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.378462 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.435909 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.436086 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.436111 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.438489 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:43.938456019 +0000 UTC m=+150.281384288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.440526 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9kmt4" podStartSLOduration=9.440510124 podStartE2EDuration="9.440510124s" podCreationTimestamp="2026-02-17 14:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:43.39518612 +0000 UTC m=+149.738114409" watchObservedRunningTime="2026-02-17 14:08:43.440510124 +0000 UTC m=+149.783438393"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.451182 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.460717 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.539444 4836 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.539831 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.039814891 +0000 UTC m=+150.382743160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.544825 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" event={"ID":"1ecc7c98-e9a3-4850-a741-7e0bcf670e27","Type":"ContainerStarted","Data":"b4aefdd0721537ab913492aef942f5b49036dd754f7aca8276dc6e3aee6f0498"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.567730 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" event={"ID":"83427963-071f-40a0-8988-b39a3d41e59f","Type":"ContainerStarted","Data":"972e5982538d255cd8980703966e8868df496eeb602cbe64b2430afbfe05045f"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.567784 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" event={"ID":"83427963-071f-40a0-8988-b39a3d41e59f","Type":"ContainerStarted","Data":"0b77a265395504163a0cad4be47c7aef193020081ba1eef748429bf9343c0461"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.569691 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" podStartSLOduration=127.569666163 podStartE2EDuration="2m7.569666163s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:43.557280304 +0000 UTC m=+149.900208603" watchObservedRunningTime="2026-02-17 14:08:43.569666163 +0000 UTC m=+149.912594432" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.570688 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jhzxl" podStartSLOduration=127.570682711 podStartE2EDuration="2m7.570682711s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:43.48142104 +0000 UTC m=+149.824349309" watchObservedRunningTime="2026-02-17 14:08:43.570682711 +0000 UTC m=+149.913611000" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.582098 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.597935 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.598467 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.598852 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" event={"ID":"e634dd4f-5615-4dd7-ac0f-8c22a3ee7d40","Type":"ContainerStarted","Data":"6d0e2dedcae09446f7b3bfaf744b1c829817ff9ce38f5a925b95962072d0f167"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.639454 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8ngwr" podStartSLOduration=9.639417015 podStartE2EDuration="9.639417015s" podCreationTimestamp="2026-02-17 14:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:43.618024338 +0000 UTC m=+149.960952627" watchObservedRunningTime="2026-02-17 14:08:43.639417015 +0000 UTC m=+149.982345294" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.640187 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.641108 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.141083449 +0000 UTC m=+150.484011718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.743315 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.744164 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.244149766 +0000 UTC m=+150.587078035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.752671 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtjx7" podStartSLOduration=128.752653012 podStartE2EDuration="2m8.752653012s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:43.66293236 +0000 UTC m=+150.005860649" watchObservedRunningTime="2026-02-17 14:08:43.752653012 +0000 UTC m=+150.095581291" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.770634 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jjmwc" podStartSLOduration=127.770601959 podStartE2EDuration="2m7.770601959s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:43.76838463 +0000 UTC m=+150.111312909" watchObservedRunningTime="2026-02-17 14:08:43.770601959 +0000 UTC m=+150.113530228" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.790264 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" event={"ID":"56a1d7ef-ccae-4b8e-b94f-edee0ce6e902","Type":"ContainerStarted","Data":"36f97f4b465b627a7bbdc9ff6a08105f7f3504b8424041c3c5d0af5de400a7e9"} 
Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.792089 4836 generic.go:334] "Generic (PLEG): container finished" podID="66402e53-3287-45c4-bceb-78fc99836c5b" containerID="e1410cad0bdf253c1f33bcce8f3592648cbfa9dbf3714face250be327e64211e" exitCode=0 Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.792143 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" event={"ID":"66402e53-3287-45c4-bceb-78fc99836c5b","Type":"ContainerDied","Data":"e1410cad0bdf253c1f33bcce8f3592648cbfa9dbf3714face250be327e64211e"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.815232 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-ld8ls" event={"ID":"b8fcc519-bc1d-4a7e-8005-cb29f435f4e5","Type":"ContainerStarted","Data":"f4b1b6ddbd3387e065362939698ff59a43d98af047d58fe3730f24961cf9d3f3"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.926481 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:43 crc kubenswrapper[4836]: E0217 14:08:43.928098 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.42807941 +0000 UTC m=+150.771007689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.975764 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:43 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:43 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:43 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.975821 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.976456 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-284hg" event={"ID":"f799cad1-5a28-4af5-8070-5c365cddbf78","Type":"ContainerStarted","Data":"a342a3e318d1021d98dd316efc76e3c2170c5c0d31cd5bd52c4f284086a9a33f"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.976507 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-284hg" event={"ID":"f799cad1-5a28-4af5-8070-5c365cddbf78","Type":"ContainerStarted","Data":"6822c77d848fcdab528abca24fc98ef79abf427180e8f5d03d60f8f9a6387c3d"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.976816 4836 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-dns/dns-default-284hg" Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.985267 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" event={"ID":"c2f4f6fb-d604-402f-83d0-6b25781c3aa8","Type":"ContainerStarted","Data":"4e7219136a28b611d66687e06cd6aadc0123bae2d2908a16f756658233014eb9"} Feb 17 14:08:43 crc kubenswrapper[4836]: I0217 14:08:43.986856 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" event={"ID":"0fc0c6ee-c7d3-4e99-a6c2-b8de8f7ffa00","Type":"ContainerStarted","Data":"4212dc04069a7b1e36bc72e9b817419946105d0e67d1d9d67b83b9eeff6bc959"} Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.023781 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" event={"ID":"1679c4a6-a707-4150-825b-5cb8b90cb27c","Type":"ContainerStarted","Data":"82d604aef88d37b6109f35e29ebfcdc7a938ff194ec971f9fa09d7f3c1413d68"} Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.027694 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.027988 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.527976061 +0000 UTC m=+150.870904330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.143349 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.143597 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.144070 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.644049253 +0000 UTC m=+150.986977592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.144148 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.144444 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.644437614 +0000 UTC m=+150.987365883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.244738 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.246423 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.746402801 +0000 UTC m=+151.089331090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.347625 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.348034 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.848019939 +0000 UTC m=+151.190948208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.391600 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wgzvh" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.465696 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.477918 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.478248 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:44.978229776 +0000 UTC m=+151.321158055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.502834 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/caed4fb3-6dd4-4427-880f-fee413854d48-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.514608 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l5kfm" podStartSLOduration=129.514588672 podStartE2EDuration="2m9.514588672s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:44.477413674 +0000 UTC m=+150.820341963" watchObservedRunningTime="2026-02-17 14:08:44.514588672 +0000 UTC m=+150.857516941" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.580503 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.581134 4836 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.081107147 +0000 UTC m=+151.424035416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.707092 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.707945 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.207922085 +0000 UTC m=+151.550850354 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.711621 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.713638 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-72n7k" podStartSLOduration=128.713598575 podStartE2EDuration="2m8.713598575s" podCreationTimestamp="2026-02-17 14:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:44.516359279 +0000 UTC m=+150.859287568" watchObservedRunningTime="2026-02-17 14:08:44.713598575 +0000 UTC m=+151.056526864" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.787073 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-2xtf8" podStartSLOduration=129.787048085 podStartE2EDuration="2m9.787048085s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:44.786689576 +0000 UTC m=+151.129617845" watchObservedRunningTime="2026-02-17 14:08:44.787048085 +0000 UTC m=+151.129976344" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.810978 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.811346 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.3113292 +0000 UTC m=+151.654257469 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.921837 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:44 crc kubenswrapper[4836]: E0217 14:08:44.922163 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.422136532 +0000 UTC m=+151.765064801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.953998 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-284hg" podStartSLOduration=10.953969828 podStartE2EDuration="10.953969828s" podCreationTimestamp="2026-02-17 14:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:44.874000985 +0000 UTC m=+151.216929264" watchObservedRunningTime="2026-02-17 14:08:44.953969828 +0000 UTC m=+151.296898117" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.965232 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:44 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:44 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:44 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.965403 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.982344 4836 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-9w8zr"] Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.983609 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:44 crc kubenswrapper[4836]: I0217 14:08:44.990652 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.025004 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-utilities\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.025117 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.025167 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-catalog-content\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.025203 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tknx6\" (UniqueName: 
\"kubernetes.io/projected/089d1289-afe9-4ffe-9d96-ac10058335ed-kube-api-access-tknx6\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.025658 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.52563642 +0000 UTC m=+151.868564689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.027277 4836 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x2x76 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.027413 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" podUID="c758606a-b3e4-494e-a2a6-7a7320277b37" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.032656 4836 patch_prober.go:28] interesting 
pod/openshift-config-operator-7777fb866f-2rnsr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.032769 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" podUID="c26f912f-f640-4b4c-ab61-dd2a163f12ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.129981 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.130309 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-utilities\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.130413 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-catalog-content\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc 
kubenswrapper[4836]: I0217 14:08:45.130450 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tknx6\" (UniqueName: \"kubernetes.io/projected/089d1289-afe9-4ffe-9d96-ac10058335ed-kube-api-access-tknx6\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.130985 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.630962067 +0000 UTC m=+151.973890336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.131900 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-utilities\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.132049 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-catalog-content\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " 
pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.268935 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.270565 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.770551753 +0000 UTC m=+152.113480022 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.348306 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tknx6\" (UniqueName: \"kubernetes.io/projected/089d1289-afe9-4ffe-9d96-ac10058335ed-kube-api-access-tknx6\") pod \"community-operators-9w8zr\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") " pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.374939 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.375430 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:45.875409857 +0000 UTC m=+152.218338126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.594370 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.594865 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.094852643 +0000 UTC m=+152.437780912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.622189 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.632445 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vfmw4"] Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.634087 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: W0217 14:08:45.639144 4836 reflector.go:561] object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g": failed to list *v1.Secret: secrets "certified-operators-dockercfg-4rs5g" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.639207 4836 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-4rs5g\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"certified-operators-dockercfg-4rs5g\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.663677 4836 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w8zr"] Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.672974 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmpvx"] Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.674588 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.696316 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.696505 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd5dt\" (UniqueName: \"kubernetes.io/projected/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-kube-api-access-xd5dt\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.696577 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fk4b\" (UniqueName: \"kubernetes.io/projected/c6c873c6-ddde-4b9b-9141-e6de9be567d4-kube-api-access-7fk4b\") pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.696599 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-catalog-content\") 
pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.696640 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-catalog-content\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.696668 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-utilities\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.696710 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-utilities\") pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.696826 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.19681092 +0000 UTC m=+152.539739189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.799901 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd5dt\" (UniqueName: \"kubernetes.io/projected/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-kube-api-access-xd5dt\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.800001 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.800050 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fk4b\" (UniqueName: \"kubernetes.io/projected/c6c873c6-ddde-4b9b-9141-e6de9be567d4-kube-api-access-7fk4b\") pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.800100 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-catalog-content\") pod 
\"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.800138 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-catalog-content\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.800182 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-utilities\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.800205 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-utilities\") pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.800828 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-utilities\") pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.801762 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:46.301747606 +0000 UTC m=+152.644675875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.802210 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-catalog-content\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.802578 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-catalog-content\") pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.802891 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-utilities\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.941394 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:45 crc kubenswrapper[4836]: E0217 14:08:45.941925 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.441900127 +0000 UTC m=+152.784828396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.950954 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:45 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:45 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:45 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:45 crc kubenswrapper[4836]: I0217 14:08:45.951027 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.019021 4836 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-cmk55"] Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.020235 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.043267 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.043805 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.543782993 +0000 UTC m=+152.886711322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.097753 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmpvx"] Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.144901 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.145451 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-catalog-content\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.145491 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-utilities\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.145534 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-fmwlv\" (UniqueName: \"kubernetes.io/projected/f1bd4ed0-3b99-4446-9218-71bb589da4a4-kube-api-access-fmwlv\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.145682 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.645662538 +0000 UTC m=+152.988590807 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.175537 4836 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2rnsr container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.175607 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" podUID="c26f912f-f640-4b4c-ab61-dd2a163f12ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.194782 4836 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x2x76 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": context deadline exceeded" start-of-body= Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.194828 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" podUID="c758606a-b3e4-494e-a2a6-7a7320277b37" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": context deadline exceeded" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.200208 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmk55"] Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.203733 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fk4b\" (UniqueName: \"kubernetes.io/projected/c6c873c6-ddde-4b9b-9141-e6de9be567d4-kube-api-access-7fk4b\") pod \"community-operators-tmpvx\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.280238 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-catalog-content\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.280532 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-utilities\") pod \"certified-operators-cmk55\" (UID: 
\"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.280566 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmwlv\" (UniqueName: \"kubernetes.io/projected/f1bd4ed0-3b99-4446-9218-71bb589da4a4-kube-api-access-fmwlv\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.280590 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.280887 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.780875477 +0000 UTC m=+153.123803746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.281246 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-catalog-content\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.281477 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-utilities\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.288277 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfmw4"] Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.315014 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd5dt\" (UniqueName: \"kubernetes.io/projected/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-kube-api-access-xd5dt\") pod \"certified-operators-vfmw4\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") " pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.352209 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.381522 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.382713 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.88269161 +0000 UTC m=+153.225619879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.386367 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmwlv\" (UniqueName: \"kubernetes.io/projected/f1bd4ed0-3b99-4446-9218-71bb589da4a4-kube-api-access-fmwlv\") pod \"certified-operators-cmk55\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.490187 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.490627 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:46.990615256 +0000 UTC m=+153.333543525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.530966 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.531050 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.531130 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.531149 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.531417 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" event={"ID":"66402e53-3287-45c4-bceb-78fc99836c5b","Type":"ContainerStarted","Data":"aa94ece838f84b94397713060dcb950b0f6669b05f4564f61741a418f75afa9f"} Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.595874 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.09584118 +0000 UTC m=+153.438769449 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.626348 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.626821 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.627393 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.127330636 +0000 UTC m=+153.470258905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.730894 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.732452 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.232425757 +0000 UTC m=+153.575354026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.837145 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.837522 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.337507607 +0000 UTC m=+153.680435876 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.904559 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" Feb 17 14:08:46 crc kubenswrapper[4836]: I0217 14:08:46.972481 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:46 crc kubenswrapper[4836]: E0217 14:08:46.973856 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.473836916 +0000 UTC m=+153.816765185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.039908 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.041314 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.052177 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.077773 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.078162 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.578150266 +0000 UTC m=+153.921078525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.087572 4836 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2rnsr container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.087666 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2rnsr" podUID="c26f912f-f640-4b4c-ab61-dd2a163f12ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.087796 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:47 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:47 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:47 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.087869 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" 
podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.179899 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.180005 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.67998565 +0000 UTC m=+154.022913919 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.180207 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.180490 4836 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.680483933 +0000 UTC m=+154.023412202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.252145 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.265044 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.272779 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pxwhr"] Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.274221 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.281281 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.281617 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.281742 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.781721171 +0000 UTC m=+154.124649440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.329495 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxwhr"] Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.383354 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-catalog-content\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.383427 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqqlz\" (UniqueName: \"kubernetes.io/projected/e9f23804-837d-4d3c-94b7-7cdefe6e94df-kube-api-access-hqqlz\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.383458 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.383478 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-utilities\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.384810 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.884794868 +0000 UTC m=+154.227723137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.484801 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.485005 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-catalog-content\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 
14:08:47.485059 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqqlz\" (UniqueName: \"kubernetes.io/projected/e9f23804-837d-4d3c-94b7-7cdefe6e94df-kube-api-access-hqqlz\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.485093 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-utilities\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.485770 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-utilities\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.486007 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-catalog-content\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.486070 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:47.986055876 +0000 UTC m=+154.328984145 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.584751 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5kqqh"] Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.585923 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.586310 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.586771 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:48.0867566 +0000 UTC m=+154.429684869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.609555 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kqqh"] Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.616260 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqqlz\" (UniqueName: \"kubernetes.io/projected/e9f23804-837d-4d3c-94b7-7cdefe6e94df-kube-api-access-hqqlz\") pod \"redhat-marketplace-pxwhr\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") " pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.688825 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.689060 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-utilities\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.689088 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t8k6d\" (UniqueName: \"kubernetes.io/projected/c85860a6-c3bb-448b-b812-cbf38230de01-kube-api-access-t8k6d\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.689164 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-catalog-content\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.689893 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:48.189871118 +0000 UTC m=+154.532799387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.814171 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-utilities\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.814220 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8k6d\" (UniqueName: \"kubernetes.io/projected/c85860a6-c3bb-448b-b812-cbf38230de01-kube-api-access-t8k6d\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.814271 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-catalog-content\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.814324 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: 
\"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.814695 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:48.314678221 +0000 UTC m=+154.657606490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.817908 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-catalog-content\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.823795 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-utilities\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.868580 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.916456 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8k6d\" (UniqueName: \"kubernetes.io/projected/c85860a6-c3bb-448b-b812-cbf38230de01-kube-api-access-t8k6d\") pod \"redhat-marketplace-5kqqh\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.917283 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-fqzrl" Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.918990 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:47 crc kubenswrapper[4836]: E0217 14:08:47.920132 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:48.4201135 +0000 UTC m=+154.763041769 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.947803 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:47 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:47 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:47 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:47 crc kubenswrapper[4836]: I0217 14:08:47.948045 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.020064 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.020417 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:48.520405073 +0000 UTC m=+154.863333342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.072701 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.072742 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.101424 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-252vj"] Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.102684 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.108642 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.124995 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.125166 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-catalog-content\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.125366 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thgpf\" (UniqueName: \"kubernetes.io/projected/a172042c-7dc6-4cea-906e-3d9135523f15-kube-api-access-thgpf\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.125392 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-utilities\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.126361 4836 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:48.626279734 +0000 UTC m=+154.969208013 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.133464 4836 patch_prober.go:28] interesting pod/console-f9d7485db-6zspj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.133535 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6zspj" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.134961 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-252vj"] Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.143842 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.171605 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x2x76" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.229664 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-utilities\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.229705 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thgpf\" (UniqueName: \"kubernetes.io/projected/a172042c-7dc6-4cea-906e-3d9135523f15-kube-api-access-thgpf\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.229762 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-catalog-content\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.229808 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.230185 4836 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:48.730171103 +0000 UTC m=+155.073099372 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.230350 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-utilities\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.230837 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-catalog-content\") pod \"redhat-operators-252vj\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.295989 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.308392 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thgpf\" (UniqueName: \"kubernetes.io/projected/a172042c-7dc6-4cea-906e-3d9135523f15-kube-api-access-thgpf\") pod \"redhat-operators-252vj\" (UID: 
\"a172042c-7dc6-4cea-906e-3d9135523f15\") " pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.336956 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.337955 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:48.837924534 +0000 UTC m=+155.180852803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.433982 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.484250 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.632862 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.633654 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.133637865 +0000 UTC m=+155.476566144 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.798794 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.810015 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.309978607 +0000 UTC m=+155.652906876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.823988 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"01b4a0bc4233e480d2a5e4ad0e393206ecbab9a812054b85c0b53ea0a3483348"} Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.824041 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9w8zr"] Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.824718 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.825126 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.325108999 +0000 UTC m=+155.668037268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.826406 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"caed4fb3-6dd4-4427-880f-fee413854d48","Type":"ContainerStarted","Data":"b0ae66c7e07c61466cd3f90b98740cfd0b7ef75ac524fdbc34cb7a0d3e897bbf"} Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.906978 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" event={"ID":"66402e53-3287-45c4-bceb-78fc99836c5b","Type":"ContainerStarted","Data":"170dd031ad3e600b8c3e288bbcdf262adb7b37fc3c878cc3458e081c8193a929"} Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.940127 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:48 crc kubenswrapper[4836]: E0217 14:08:48.941312 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.441260792 +0000 UTC m=+155.784189121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.952946 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:48 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:48 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:48 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:48 crc kubenswrapper[4836]: I0217 14:08:48.953005 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.005848 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"988fe3fef96881ecf0d201e42800c566c722a81d012c8d57dd9b9480fe9595c1"} Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.012328 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6d6e2e0928ae5ceb546a64e10f83c1dcdbf16201e96a338b090459780c5233b8"} Feb 17 14:08:49 crc 
kubenswrapper[4836]: I0217 14:08:49.032851 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5rfnm"] Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.034506 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.061868 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-z6h7n" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.062808 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-catalog-content\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.062828 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-utilities\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.062943 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.062975 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnjxf\" (UniqueName: 
\"kubernetes.io/projected/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-kube-api-access-jnjxf\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.066098 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.566085697 +0000 UTC m=+155.909013966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.163831 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.164118 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-catalog-content\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.164149 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-utilities\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.164270 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnjxf\" (UniqueName: \"kubernetes.io/projected/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-kube-api-access-jnjxf\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.164835 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-catalog-content\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.164922 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.6649024 +0000 UTC m=+156.007830669 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.165163 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-utilities\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.218065 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rfnm"] Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.228481 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.229284 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.300906 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.301332 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.301407 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.301811 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.801797965 +0000 UTC m=+156.144726234 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.340490 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnjxf\" (UniqueName: \"kubernetes.io/projected/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-kube-api-access-jnjxf\") pod \"redhat-operators-5rfnm\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.356257 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.398926 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.403102 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.403397 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.403535 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.403642 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.403715 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:49.90368377 +0000 UTC m=+156.246612049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.439897 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmpvx"] Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.448093 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.456823 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.478656 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" podStartSLOduration=134.47862619 podStartE2EDuration="2m14.47862619s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:49.443491137 +0000 UTC m=+155.786419416" watchObservedRunningTime="2026-02-17 14:08:49.47862619 +0000 UTC m=+155.821554489" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.481867 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.507533 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.507874 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.007860486 +0000 UTC m=+156.350788765 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.508277 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.601371 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmk55"] Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.608431 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.608768 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.108752605 +0000 UTC m=+156.451680884 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.611858 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vfmw4"] Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.714446 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.715251 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.215234572 +0000 UTC m=+156.558162851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.830526 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.831000 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.330982895 +0000 UTC m=+156.673911164 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.946150 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:08:49 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:08:49 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:08:49 crc kubenswrapper[4836]: healthz check failed Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.946204 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.947109 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:49 crc kubenswrapper[4836]: E0217 14:08:49.947452 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-17 14:08:50.447437697 +0000 UTC m=+156.790365966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:49 crc kubenswrapper[4836]: I0217 14:08:49.975827 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kqqh"] Feb 17 14:08:50 crc kubenswrapper[4836]: W0217 14:08:50.035655 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8762f2f2_8375_4fdd_8a29_ea2ab598afa1.slice/crio-a5a3c8cb6babc233f1c2ac1e8dd3635788628a1cb1bb705f8a779b47d0562e2b WatchSource:0}: Error finding container a5a3c8cb6babc233f1c2ac1e8dd3635788628a1cb1bb705f8a779b47d0562e2b: Status 404 returned error can't find the container with id a5a3c8cb6babc233f1c2ac1e8dd3635788628a1cb1bb705f8a779b47d0562e2b Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.039390 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxwhr"] Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.052284 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.052593 4836 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.552575578 +0000 UTC m=+156.895503847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.135731 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpvx" event={"ID":"c6c873c6-ddde-4b9b-9141-e6de9be567d4","Type":"ContainerStarted","Data":"79e0157c4fae70c4a163e7552bd45039fe6e084cf3fa63db4fbd428401695df6"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.145996 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8zr" event={"ID":"089d1289-afe9-4ffe-9d96-ac10058335ed","Type":"ContainerStarted","Data":"aec4a035ba778cf216a49780b8ffa622c813a3d3daa4a826e68b03c1acc34c4d"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.154068 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:50 crc kubenswrapper[4836]: W0217 14:08:50.154538 4836 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f23804_837d_4d3c_94b7_7cdefe6e94df.slice/crio-aef23167292c1dafb12389117081e87d3bd5bee8abc67ecc65bf3cd0a4bf9f1c WatchSource:0}: Error finding container aef23167292c1dafb12389117081e87d3bd5bee8abc67ecc65bf3cd0a4bf9f1c: Status 404 returned error can't find the container with id aef23167292c1dafb12389117081e87d3bd5bee8abc67ecc65bf3cd0a4bf9f1c Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.154777 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.654762412 +0000 UTC m=+156.997690691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.161014 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kqqh" event={"ID":"c85860a6-c3bb-448b-b812-cbf38230de01","Type":"ContainerStarted","Data":"f9efa614ea777c6c1f7f2234c739bb0e406ce4096c5477be16d8aba1cfb4c85e"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.249312 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfmw4" event={"ID":"8762f2f2-8375-4fdd-8a29-ea2ab598afa1","Type":"ContainerStarted","Data":"a5a3c8cb6babc233f1c2ac1e8dd3635788628a1cb1bb705f8a779b47d0562e2b"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.257553 4836 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.257651 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.757614383 +0000 UTC m=+157.100542652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.257794 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.258171 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.758156697 +0000 UTC m=+157.101084966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.299857 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"5279ed1ec23ad195ee96d2ed7467f460d22af9b931e9b620c8ade5d19327e788"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.301128 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.303566 4836 generic.go:334] "Generic (PLEG): container finished" podID="91eb437c-beea-4f2d-b3f7-505b87fe6dee" containerID="4b8580f44aade0425b4de34e0f49d07bd6192e526f9c10aa11b53556a3546660" exitCode=0 Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.303694 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" event={"ID":"91eb437c-beea-4f2d-b3f7-505b87fe6dee","Type":"ContainerDied","Data":"4b8580f44aade0425b4de34e0f49d07bd6192e526f9c10aa11b53556a3546660"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.305464 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"caed4fb3-6dd4-4427-880f-fee413854d48","Type":"ContainerStarted","Data":"894f04d8bb69cf58bea3ab4206057ab2e51ebe330575f3a10c3c1c616fcfa44c"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.307614 4836 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmk55" event={"ID":"f1bd4ed0-3b99-4446-9218-71bb589da4a4","Type":"ContainerStarted","Data":"6e75d917f9b18c07b2feade7d6ceab556bb6226e0a78e8a3d47b72928e406bad"} Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.415671 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.416139 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:50.916118871 +0000 UTC m=+157.259047140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.517639 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.520305 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:51.020211364 +0000 UTC m=+157.363139633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.653233 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.769806 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=7.76976773 podStartE2EDuration="7.76976773s" podCreationTimestamp="2026-02-17 14:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:50.759142848 +0000 UTC m=+157.102071127" watchObservedRunningTime="2026-02-17 14:08:50.76976773 +0000 UTC m=+157.112695999" Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.815050 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 14:08:51.314985521 +0000 UTC m=+157.657913790 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.815981 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.817636 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" event={"ID":"c2f4f6fb-d604-402f-83d0-6b25781c3aa8","Type":"ContainerStarted","Data":"d7db103eb0f05f6770e34d461544f70605661f51228b5fac894fae8ee978b438"}
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.817671 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"53956ee709ceed9511b467ea1bb33e2e4cb24195855b9addf992d2f901ce9683"}
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.817686 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-252vj"]
Feb 17 14:08:50 crc kubenswrapper[4836]: E0217 14:08:50.817977 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 14:08:51.31796831 +0000 UTC m=+157.660896579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5vhz9" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.838460 4836 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.852383 4836 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T14:08:50.838493506Z","Handler":null,"Name":""}
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.855543 4836 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.855570 4836 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.856569 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3c737909227f0810cdfcb9e38b0fa4e5dbbb9bdbeb1deb19a77ed4f06c928a68"}
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.925317 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 17 14:08:50 crc kubenswrapper[4836]: I0217 14:08:50.935382 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.026960 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.053283 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.053334 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.089597 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:51 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:51 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:51 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.089650 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.287746 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5vhz9\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.342835 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.387041 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5rfnm"]
Feb 17 14:08:51 crc kubenswrapper[4836]: W0217 14:08:51.404397 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88cf2bb1_d70f_4b82_9b9a_9d7c7a4244ff.slice/crio-f4532e92cda0f4cb49095bb6a57a64c5099b9a9601e8744de1408686bb81a1cb WatchSource:0}: Error finding container f4532e92cda0f4cb49095bb6a57a64c5099b9a9601e8744de1408686bb81a1cb: Status 404 returned error can't find the container with id f4532e92cda0f4cb49095bb6a57a64c5099b9a9601e8744de1408686bb81a1cb
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.441850 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.915699 4836 generic.go:334] "Generic (PLEG): container finished" podID="c85860a6-c3bb-448b-b812-cbf38230de01" containerID="d36870560f8d1243c818dca57cf74dea7a07e8c43795bb396db32ccfc2a302b6" exitCode=0
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.916428 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kqqh" event={"ID":"c85860a6-c3bb-448b-b812-cbf38230de01","Type":"ContainerDied","Data":"d36870560f8d1243c818dca57cf74dea7a07e8c43795bb396db32ccfc2a302b6"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.921256 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:51 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:51 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:51 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.921593 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.925900 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"895e5f35-c3c6-46b6-878c-6d9a47b6221f","Type":"ContainerStarted","Data":"4866eba95fc74299a5d4d267763f0b47fa1876ffea3c2307e4ea9572f0fa5ed5"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.925941 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"895e5f35-c3c6-46b6-878c-6d9a47b6221f","Type":"ContainerStarted","Data":"9a71104ba91fe474c5ec1895a885d742689d99f71786a709bb26ffc4e5fce4b7"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.926773 4836 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.928539 4836 generic.go:334] "Generic (PLEG): container finished" podID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerID="f5f1510b84a48fd765ca27386941284d20f6da0225cb6c655223588a86aa6f8f" exitCode=0
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.928606 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8zr" event={"ID":"089d1289-afe9-4ffe-9d96-ac10058335ed","Type":"ContainerDied","Data":"f5f1510b84a48fd765ca27386941284d20f6da0225cb6c655223588a86aa6f8f"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.932577 4836 generic.go:334] "Generic (PLEG): container finished" podID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerID="8aece0956593a85800757e782bdc3eb1d3d87f1ac99e3fc8ce9f7012a48be219" exitCode=0
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.932660 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpvx" event={"ID":"c6c873c6-ddde-4b9b-9141-e6de9be567d4","Type":"ContainerDied","Data":"8aece0956593a85800757e782bdc3eb1d3d87f1ac99e3fc8ce9f7012a48be219"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.942366 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5vhz9"]
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.961250 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" event={"ID":"c2f4f6fb-d604-402f-83d0-6b25781c3aa8","Type":"ContainerStarted","Data":"c23147920100542e35ac81853accaec49db8552b462d160596b4b74afffcf2a6"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.974062 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerStarted","Data":"6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.974111 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerStarted","Data":"f4532e92cda0f4cb49095bb6a57a64c5099b9a9601e8744de1408686bb81a1cb"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.977283 4836 generic.go:334] "Generic (PLEG): container finished" podID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerID="73060b123dbcdb54cacfb96235e77305156ac3a055b89a97013a4725f13fbc92" exitCode=0
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.977784 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxwhr" event={"ID":"e9f23804-837d-4d3c-94b7-7cdefe6e94df","Type":"ContainerDied","Data":"73060b123dbcdb54cacfb96235e77305156ac3a055b89a97013a4725f13fbc92"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.977829 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxwhr" event={"ID":"e9f23804-837d-4d3c-94b7-7cdefe6e94df","Type":"ContainerStarted","Data":"aef23167292c1dafb12389117081e87d3bd5bee8abc67ecc65bf3cd0a4bf9f1c"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.978782 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.9787661 podStartE2EDuration="2.9787661s" podCreationTimestamp="2026-02-17 14:08:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:51.95279714 +0000 UTC m=+158.295725419" watchObservedRunningTime="2026-02-17 14:08:51.9787661 +0000 UTC m=+158.321694369"
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.982373 4836 generic.go:334] "Generic (PLEG): container finished" podID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerID="fdc430f3f9d22a422de0b99423af704e6cc0b0c2a36fc9623c6db36600886e79" exitCode=0
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.982440 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfmw4" event={"ID":"8762f2f2-8375-4fdd-8a29-ea2ab598afa1","Type":"ContainerDied","Data":"fdc430f3f9d22a422de0b99423af704e6cc0b0c2a36fc9623c6db36600886e79"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.988135 4836 generic.go:334] "Generic (PLEG): container finished" podID="a172042c-7dc6-4cea-906e-3d9135523f15" containerID="fbdef3e9d702e26b2d9eab100a7cb39741759b5bc646072d63aa2cde6951ee43" exitCode=0
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.988218 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-252vj" event={"ID":"a172042c-7dc6-4cea-906e-3d9135523f15","Type":"ContainerDied","Data":"fbdef3e9d702e26b2d9eab100a7cb39741759b5bc646072d63aa2cde6951ee43"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.988246 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-252vj" event={"ID":"a172042c-7dc6-4cea-906e-3d9135523f15","Type":"ContainerStarted","Data":"8ef482fc8eb2712be43ba1d606607d7a887e18d38349afed73ed063a65b62543"}
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.996656 4836 generic.go:334] "Generic (PLEG): container finished" podID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerID="9679e4b4c4f0f644eb56ce9a9ac7ad7178d79f35bc0d94642d6b2ded1809a114" exitCode=0
Feb 17 14:08:51 crc kubenswrapper[4836]: I0217 14:08:51.996719 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmk55" event={"ID":"f1bd4ed0-3b99-4446-9218-71bb589da4a4","Type":"ContainerDied","Data":"9679e4b4c4f0f644eb56ce9a9ac7ad7178d79f35bc0d94642d6b2ded1809a114"}
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.594447 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.853183 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.923036 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:52 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:52 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:52 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.923092 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.935524 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91eb437c-beea-4f2d-b3f7-505b87fe6dee-secret-volume\") pod \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") "
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.935587 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kdnd\" (UniqueName: \"kubernetes.io/projected/91eb437c-beea-4f2d-b3f7-505b87fe6dee-kube-api-access-5kdnd\") pod \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") "
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.935727 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb437c-beea-4f2d-b3f7-505b87fe6dee-config-volume\") pod \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\" (UID: \"91eb437c-beea-4f2d-b3f7-505b87fe6dee\") "
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.936732 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91eb437c-beea-4f2d-b3f7-505b87fe6dee-config-volume" (OuterVolumeSpecName: "config-volume") pod "91eb437c-beea-4f2d-b3f7-505b87fe6dee" (UID: "91eb437c-beea-4f2d-b3f7-505b87fe6dee"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.965367 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91eb437c-beea-4f2d-b3f7-505b87fe6dee-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "91eb437c-beea-4f2d-b3f7-505b87fe6dee" (UID: "91eb437c-beea-4f2d-b3f7-505b87fe6dee"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:08:52 crc kubenswrapper[4836]: I0217 14:08:52.973060 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91eb437c-beea-4f2d-b3f7-505b87fe6dee-kube-api-access-5kdnd" (OuterVolumeSpecName: "kube-api-access-5kdnd") pod "91eb437c-beea-4f2d-b3f7-505b87fe6dee" (UID: "91eb437c-beea-4f2d-b3f7-505b87fe6dee"). InnerVolumeSpecName "kube-api-access-5kdnd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.036994 4836 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91eb437c-beea-4f2d-b3f7-505b87fe6dee-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.037040 4836 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/91eb437c-beea-4f2d-b3f7-505b87fe6dee-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.037056 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kdnd\" (UniqueName: \"kubernetes.io/projected/91eb437c-beea-4f2d-b3f7-505b87fe6dee-kube-api-access-5kdnd\") on node \"crc\" DevicePath \"\""
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.064813 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" event={"ID":"c2f4f6fb-d604-402f-83d0-6b25781c3aa8","Type":"ContainerStarted","Data":"5545da0ab4aa419e6b59b3baf5274c6959da6ac1e41ca507d90e592cd6ad25c6"}
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.068590 4836 generic.go:334] "Generic (PLEG): container finished" podID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerID="6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8" exitCode=0
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.068660 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerDied","Data":"6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8"}
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.070373 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps" event={"ID":"91eb437c-beea-4f2d-b3f7-505b87fe6dee","Type":"ContainerDied","Data":"d146dcdb5314cbe00f41e751cbb49254cf23ab6e8cdc06caed3ce29aac7230d2"}
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.070394 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d146dcdb5314cbe00f41e751cbb49254cf23ab6e8cdc06caed3ce29aac7230d2"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.070463 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.085108 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" event={"ID":"4cd3f585-c95f-43ee-962c-ea33aff90415","Type":"ContainerStarted","Data":"bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6"}
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.085161 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" event={"ID":"4cd3f585-c95f-43ee-962c-ea33aff90415","Type":"ContainerStarted","Data":"b92bf709add22f9c57e92a26debc7c9604b5ddd76791fbcef0b8821c381eba8e"}
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.085206 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.088539 4836 generic.go:334] "Generic (PLEG): container finished" podID="caed4fb3-6dd4-4427-880f-fee413854d48" containerID="894f04d8bb69cf58bea3ab4206057ab2e51ebe330575f3a10c3c1c616fcfa44c" exitCode=0
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.088615 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"caed4fb3-6dd4-4427-880f-fee413854d48","Type":"ContainerDied","Data":"894f04d8bb69cf58bea3ab4206057ab2e51ebe330575f3a10c3c1c616fcfa44c"}
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.105360 4836 generic.go:334] "Generic (PLEG): container finished" podID="895e5f35-c3c6-46b6-878c-6d9a47b6221f" containerID="4866eba95fc74299a5d4d267763f0b47fa1876ffea3c2307e4ea9572f0fa5ed5" exitCode=0
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.106055 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"895e5f35-c3c6-46b6-878c-6d9a47b6221f","Type":"ContainerDied","Data":"4866eba95fc74299a5d4d267763f0b47fa1876ffea3c2307e4ea9572f0fa5ed5"}
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.161621 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-6scjm" podStartSLOduration=19.161590515 podStartE2EDuration="19.161590515s" podCreationTimestamp="2026-02-17 14:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:53.16104461 +0000 UTC m=+159.503972899" watchObservedRunningTime="2026-02-17 14:08:53.161590515 +0000 UTC m=+159.504518784"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.264158 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" podStartSLOduration=138.264112867 podStartE2EDuration="2m18.264112867s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:08:53.263826959 +0000 UTC m=+159.606755238" watchObservedRunningTime="2026-02-17 14:08:53.264112867 +0000 UTC m=+159.607041156"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.306032 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-284hg"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.617719 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.618285 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cnq25"
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.941522 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:53 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:53 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:53 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:53 crc kubenswrapper[4836]: I0217 14:08:53.941596 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:54 crc kubenswrapper[4836]: I0217 14:08:54.072099 4836 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cnq25 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]log ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]etcd ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/max-in-flight-filter ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Feb 17 14:08:54 crc kubenswrapper[4836]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/project.openshift.io-projectcache ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Feb 17 14:08:54 crc kubenswrapper[4836]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Feb 17 14:08:54 crc kubenswrapper[4836]: [-]poststarthook/openshift.io-restmapperupdater failed: reason withheld
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 17 14:08:54 crc kubenswrapper[4836]: livez check failed
Feb 17 14:08:54 crc kubenswrapper[4836]: I0217 14:08:54.072222 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" podUID="66402e53-3287-45c4-bceb-78fc99836c5b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:54 crc kubenswrapper[4836]: I0217 14:08:54.929020 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:54 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:54 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:54 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:54 crc kubenswrapper[4836]: I0217 14:08:54.929121 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:55 crc kubenswrapper[4836]: I0217 14:08:55.634393 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 14:08:55 crc kubenswrapper[4836]: I0217 14:08:55.692534 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kube-api-access\") pod \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") "
Feb 17 14:08:55 crc kubenswrapper[4836]: I0217 14:08:55.692910 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kubelet-dir\") pod \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\" (UID: \"895e5f35-c3c6-46b6-878c-6d9a47b6221f\") "
Feb 17 14:08:55 crc kubenswrapper[4836]: I0217 14:08:55.693449 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "895e5f35-c3c6-46b6-878c-6d9a47b6221f" (UID: "895e5f35-c3c6-46b6-878c-6d9a47b6221f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:08:55 crc kubenswrapper[4836]: I0217 14:08:55.807121 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "895e5f35-c3c6-46b6-878c-6d9a47b6221f" (UID: "895e5f35-c3c6-46b6-878c-6d9a47b6221f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:08:55 crc kubenswrapper[4836]: I0217 14:08:55.807618 4836 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:08:55 crc kubenswrapper[4836]: I0217 14:08:55.807644 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895e5f35-c3c6-46b6-878c-6d9a47b6221f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.020360 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:56 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:56 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:56 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.020495 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.084237 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.199999 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/caed4fb3-6dd4-4427-880f-fee413854d48-kube-api-access\") pod \"caed4fb3-6dd4-4427-880f-fee413854d48\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") "
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.200064 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caed4fb3-6dd4-4427-880f-fee413854d48-kubelet-dir\") pod \"caed4fb3-6dd4-4427-880f-fee413854d48\" (UID: \"caed4fb3-6dd4-4427-880f-fee413854d48\") "
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.200449 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/caed4fb3-6dd4-4427-880f-fee413854d48-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "caed4fb3-6dd4-4427-880f-fee413854d48" (UID: "caed4fb3-6dd4-4427-880f-fee413854d48"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.265912 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caed4fb3-6dd4-4427-880f-fee413854d48-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "caed4fb3-6dd4-4427-880f-fee413854d48" (UID: "caed4fb3-6dd4-4427-880f-fee413854d48"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.337731 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/caed4fb3-6dd4-4427-880f-fee413854d48-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.337856 4836 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caed4fb3-6dd4-4427-880f-fee413854d48-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.415427 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.415458 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.415525 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.415600 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.449132 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"caed4fb3-6dd4-4427-880f-fee413854d48","Type":"ContainerDied","Data":"b0ae66c7e07c61466cd3f90b98740cfd0b7ef75ac524fdbc34cb7a0d3e897bbf"}
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.449185 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0ae66c7e07c61466cd3f90b98740cfd0b7ef75ac524fdbc34cb7a0d3e897bbf"
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.449215 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.523986 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"895e5f35-c3c6-46b6-878c-6d9a47b6221f","Type":"ContainerDied","Data":"9a71104ba91fe474c5ec1895a885d742689d99f71786a709bb26ffc4e5fce4b7"}
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.524028 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a71104ba91fe474c5ec1895a885d742689d99f71786a709bb26ffc4e5fce4b7"
Feb 17 14:08:56 crc kubenswrapper[4836]: I0217 14:08:56.524143 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 17 14:08:57 crc kubenswrapper[4836]: I0217 14:08:57.049888 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:57 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:57 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:57 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:57 crc kubenswrapper[4836]: I0217 14:08:57.050525 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:57 crc kubenswrapper[4836]: I0217 14:08:57.927573 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:08:57 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld
Feb 17 14:08:57 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:08:57 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:08:57 crc kubenswrapper[4836]: I0217 14:08:57.927636 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:08:58 crc kubenswrapper[4836]: I0217 14:08:58.030590 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod
\"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:58 crc kubenswrapper[4836]: I0217 14:08:58.052569 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c-metrics-certs\") pod \"network-metrics-daemon-c4txt\" (UID: \"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c\") " pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:58 crc kubenswrapper[4836]: I0217 14:08:58.068379 4836 patch_prober.go:28] interesting pod/console-f9d7485db-6zspj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 14:08:58 crc kubenswrapper[4836]: I0217 14:08:58.068487 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6zspj" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 14:08:58 crc kubenswrapper[4836]: I0217 14:08:58.311079 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c4txt" Feb 17 14:08:58 crc kubenswrapper[4836]: I0217 14:08:58.708760 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:08:58 crc kubenswrapper[4836]: I0217 14:08:58.785894 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cnq25" Feb 17 14:09:00 crc kubenswrapper[4836]: I0217 14:09:00.791146 4836 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" start-of-body= Feb 17 14:09:00 crc kubenswrapper[4836]: I0217 14:09:00.791609 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded" Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.003164 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:01 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:01 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:01 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.003448 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.003530 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused (Client.Timeout exceeded while awaiting headers)" Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.005767 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.010507 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:01 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:01 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:01 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.010578 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.947008 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c4txt"] Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.952509 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:01 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:01 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:01 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:01 crc kubenswrapper[4836]: I0217 14:09:01.952575 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:02 crc kubenswrapper[4836]: I0217 14:09:02.921520 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:02 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:02 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:02 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:02 crc kubenswrapper[4836]: I0217 14:09:02.921682 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:03 crc kubenswrapper[4836]: I0217 14:09:03.279174 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c4txt" event={"ID":"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c","Type":"ContainerStarted","Data":"016aa4d7249177ed49714a2acb840cce0bfb12481beb7a8fa1a30cc84f4bbaa2"} Feb 17 14:09:03 crc kubenswrapper[4836]: I0217 14:09:03.920327 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:03 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:03 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:03 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:03 crc kubenswrapper[4836]: I0217 14:09:03.920387 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:04 crc kubenswrapper[4836]: I0217 14:09:04.339826 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c4txt" event={"ID":"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c","Type":"ContainerStarted","Data":"4b8910ff1472227e3b9e3d130a0d5ea1b05c5c942dff038408499b0c5bd79471"} Feb 17 14:09:04 crc kubenswrapper[4836]: I0217 14:09:04.923244 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:04 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:04 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:04 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:04 crc kubenswrapper[4836]: I0217 14:09:04.923585 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:05 crc kubenswrapper[4836]: I0217 14:09:05.381445 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c4txt" 
event={"ID":"8fcd2b6e-6e9f-4bbc-8625-2f21ab4b2c7c","Type":"ContainerStarted","Data":"370bf7639c10502d35b797eb6839b9eb3917522b465c0a4dd7664837b6787193"} Feb 17 14:09:05 crc kubenswrapper[4836]: I0217 14:09:05.421810 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c4txt" podStartSLOduration=150.421758751 podStartE2EDuration="2m30.421758751s" podCreationTimestamp="2026-02-17 14:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:09:05.406736112 +0000 UTC m=+171.749664401" watchObservedRunningTime="2026-02-17 14:09:05.421758751 +0000 UTC m=+171.764687040" Feb 17 14:09:05 crc kubenswrapper[4836]: I0217 14:09:05.954910 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:05 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:05 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:05 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:05 crc kubenswrapper[4836]: I0217 14:09:05.954978 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.445185 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.445208 4836 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.445257 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.445325 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.450274 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-5cbbv" Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.450852 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.450895 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.450924 4836 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="download-server" containerStatusID={"Type":"cri-o","ID":"92b59bab9fd909d359405ecf217a49ab1de8122281a49768577c5a706060d118"} pod="openshift-console/downloads-7954f5f757-5cbbv" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 17 14:09:06 crc kubenswrapper[4836]: I0217 14:09:06.451036 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" containerID="cri-o://92b59bab9fd909d359405ecf217a49ab1de8122281a49768577c5a706060d118" gracePeriod=2 Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.001162 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:07 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:07 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:07 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.001780 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.497423 4836 generic.go:334] "Generic (PLEG): container finished" podID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerID="92b59bab9fd909d359405ecf217a49ab1de8122281a49768577c5a706060d118" exitCode=0 Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.497515 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5cbbv" 
event={"ID":"d9eb5c8b-f3c7-4068-82c7-28520f6905c6","Type":"ContainerDied","Data":"92b59bab9fd909d359405ecf217a49ab1de8122281a49768577c5a706060d118"} Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.497559 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5cbbv" event={"ID":"d9eb5c8b-f3c7-4068-82c7-28520f6905c6","Type":"ContainerStarted","Data":"9f7cb281e045dc9dce4b8664374b7c5b4f753c5186831200e7b466bbba132db3"} Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.499315 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5cbbv" Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.499413 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.499448 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.957882 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:07 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:07 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:07 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:07 crc kubenswrapper[4836]: I0217 14:09:07.958613 4836 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:08 crc kubenswrapper[4836]: I0217 14:09:08.146726 4836 patch_prober.go:28] interesting pod/console-f9d7485db-6zspj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 14:09:08 crc kubenswrapper[4836]: I0217 14:09:08.146941 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6zspj" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 14:09:08 crc kubenswrapper[4836]: I0217 14:09:08.720906 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:09:08 crc kubenswrapper[4836]: I0217 14:09:08.721679 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:09:09 crc kubenswrapper[4836]: I0217 14:09:08.958259 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:09 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:09 crc 
kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:09 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:09 crc kubenswrapper[4836]: I0217 14:09:08.958529 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:09 crc kubenswrapper[4836]: I0217 14:09:09.921061 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:09 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:09 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:09 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:09 crc kubenswrapper[4836]: I0217 14:09:09.921200 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:10 crc kubenswrapper[4836]: I0217 14:09:10.924045 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:10 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:10 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:10 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:10 crc kubenswrapper[4836]: I0217 14:09:10.924168 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:11 crc kubenswrapper[4836]: I0217 14:09:11.525208 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:09:11 crc kubenswrapper[4836]: I0217 14:09:11.991769 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:11 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:11 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:11 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:11 crc kubenswrapper[4836]: I0217 14:09:11.991840 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:12 crc kubenswrapper[4836]: I0217 14:09:12.998022 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:12 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:12 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:12 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:12 crc kubenswrapper[4836]: I0217 14:09:12.998513 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:13 crc kubenswrapper[4836]: I0217 
14:09:13.953407 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:13 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:13 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:13 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:13 crc kubenswrapper[4836]: I0217 14:09:13.953534 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:15 crc kubenswrapper[4836]: I0217 14:09:15.004919 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:15 crc kubenswrapper[4836]: [-]has-synced failed: reason withheld Feb 17 14:09:15 crc kubenswrapper[4836]: [+]process-running ok Feb 17 14:09:15 crc kubenswrapper[4836]: healthz check failed Feb 17 14:09:15 crc kubenswrapper[4836]: I0217 14:09:15.005718 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:09:15 crc kubenswrapper[4836]: I0217 14:09:15.921078 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 14:09:15 crc kubenswrapper[4836]: [-]has-synced failed: reason 
withheld
Feb 17 14:09:15 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:09:15 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:09:15 crc kubenswrapper[4836]: I0217 14:09:15.921157 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:09:16 crc kubenswrapper[4836]: I0217 14:09:16.425907 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:16 crc kubenswrapper[4836]: I0217 14:09:16.426036 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4836]: I0217 14:09:16.427269 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:16 crc kubenswrapper[4836]: I0217 14:09:16.427443 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:16 crc kubenswrapper[4836]: I0217 14:09:16.924613 4836 patch_prober.go:28] interesting pod/router-default-5444994796-fqzrl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 17 14:09:16 crc kubenswrapper[4836]: [+]has-synced ok
Feb 17 14:09:16 crc kubenswrapper[4836]: [+]process-running ok
Feb 17 14:09:16 crc kubenswrapper[4836]: healthz check failed
Feb 17 14:09:16 crc kubenswrapper[4836]: I0217 14:09:16.925160 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-fqzrl" podUID="296ae94a-36e6-480b-9395-8f6a96621fdf" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 17 14:09:17 crc kubenswrapper[4836]: I0217 14:09:17.940938 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-fqzrl"
Feb 17 14:09:17 crc kubenswrapper[4836]: I0217 14:09:17.955824 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-blr59"
Feb 17 14:09:17 crc kubenswrapper[4836]: I0217 14:09:17.973070 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-fqzrl"
Feb 17 14:09:18 crc kubenswrapper[4836]: I0217 14:09:18.074470 4836 patch_prober.go:28] interesting pod/console-f9d7485db-6zspj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Feb 17 14:09:18 crc kubenswrapper[4836]: I0217 14:09:18.074536 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-6zspj" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused"
Feb 17 14:09:23 crc kubenswrapper[4836]: I0217 14:09:23.771679 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.322282 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 14:09:26 crc kubenswrapper[4836]: E0217 14:09:26.322980 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91eb437c-beea-4f2d-b3f7-505b87fe6dee" containerName="collect-profiles"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.322999 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="91eb437c-beea-4f2d-b3f7-505b87fe6dee" containerName="collect-profiles"
Feb 17 14:09:26 crc kubenswrapper[4836]: E0217 14:09:26.323026 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895e5f35-c3c6-46b6-878c-6d9a47b6221f" containerName="pruner"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.323037 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="895e5f35-c3c6-46b6-878c-6d9a47b6221f" containerName="pruner"
Feb 17 14:09:26 crc kubenswrapper[4836]: E0217 14:09:26.323057 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caed4fb3-6dd4-4427-880f-fee413854d48" containerName="pruner"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.323068 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="caed4fb3-6dd4-4427-880f-fee413854d48" containerName="pruner"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.323247 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="91eb437c-beea-4f2d-b3f7-505b87fe6dee" containerName="collect-profiles"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.323378 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="caed4fb3-6dd4-4427-880f-fee413854d48" containerName="pruner"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.323394 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="895e5f35-c3c6-46b6-878c-6d9a47b6221f" containerName="pruner"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.324006 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.330005 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.330391 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.339522 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.419366 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.419440 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.436512 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.436593 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.566467 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c99af120-e5bb-45a6-baec-1157b240bda6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.566545 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c99af120-e5bb-45a6-baec-1157b240bda6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.667423 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c99af120-e5bb-45a6-baec-1157b240bda6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.667502 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c99af120-e5bb-45a6-baec-1157b240bda6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.667553 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c99af120-e5bb-45a6-baec-1157b240bda6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.702208 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c99af120-e5bb-45a6-baec-1157b240bda6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:26 crc kubenswrapper[4836]: I0217 14:09:26.964657 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 17 14:09:28 crc kubenswrapper[4836]: I0217 14:09:28.120537 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:09:28 crc kubenswrapper[4836]: I0217 14:09:28.127940 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-6zspj"
Feb 17 14:09:29 crc kubenswrapper[4836]: I0217 14:09:29.783902 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:09:29 crc kubenswrapper[4836]: I0217 14:09:29.784271 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:09:31 crc kubenswrapper[4836]: I0217 14:09:31.859221 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 14:09:31 crc kubenswrapper[4836]: I0217 14:09:31.860451 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:31 crc kubenswrapper[4836]: I0217 14:09:31.899995 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.041521 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffb43a5d-f735-4891-912a-3ba9e47a4055-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.042786 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.042837 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-var-lock\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.220191 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffb43a5d-f735-4891-912a-3ba9e47a4055-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.220312 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.220335 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-var-lock\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.220632 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-var-lock\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.221079 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.261983 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffb43a5d-f735-4891-912a-3ba9e47a4055-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:32 crc kubenswrapper[4836]: I0217 14:09:32.580385 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.417687 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.418912 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.420736 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.420879 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.420971 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-5cbbv"
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.422114 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"9f7cb281e045dc9dce4b8664374b7c5b4f753c5186831200e7b466bbba132db3"} pod="openshift-console/downloads-7954f5f757-5cbbv" containerMessage="Container download-server failed liveness probe, will be restarted"
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.422190 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.422243 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:36 crc kubenswrapper[4836]: I0217 14:09:36.422188 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" containerID="cri-o://9f7cb281e045dc9dce4b8664374b7c5b4f753c5186831200e7b466bbba132db3" gracePeriod=2
Feb 17 14:09:37 crc kubenswrapper[4836]: I0217 14:09:37.148984 4836 generic.go:334] "Generic (PLEG): container finished" podID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerID="9f7cb281e045dc9dce4b8664374b7c5b4f753c5186831200e7b466bbba132db3" exitCode=0
Feb 17 14:09:37 crc kubenswrapper[4836]: I0217 14:09:37.149064 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5cbbv" event={"ID":"d9eb5c8b-f3c7-4068-82c7-28520f6905c6","Type":"ContainerDied","Data":"9f7cb281e045dc9dce4b8664374b7c5b4f753c5186831200e7b466bbba132db3"}
Feb 17 14:09:37 crc kubenswrapper[4836]: I0217 14:09:37.149121 4836 scope.go:117] "RemoveContainer" containerID="92b59bab9fd909d359405ecf217a49ab1de8122281a49768577c5a706060d118"
Feb 17 14:09:39 crc kubenswrapper[4836]: I0217 14:09:39.342765 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 17 14:09:39 crc kubenswrapper[4836]: I0217 14:09:39.420016 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Feb 17 14:09:42 crc kubenswrapper[4836]: W0217 14:09:42.658947 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podffb43a5d_f735_4891_912a_3ba9e47a4055.slice/crio-29d09f7152c586bbd27086739e98c573e0ee0e7fe283e6e465f8d46989dcb7a3 WatchSource:0}: Error finding container 29d09f7152c586bbd27086739e98c573e0ee0e7fe283e6e465f8d46989dcb7a3: Status 404 returned error can't find the container with id 29d09f7152c586bbd27086739e98c573e0ee0e7fe283e6e465f8d46989dcb7a3
Feb 17 14:09:43 crc kubenswrapper[4836]: I0217 14:09:43.210370 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffb43a5d-f735-4891-912a-3ba9e47a4055","Type":"ContainerStarted","Data":"29d09f7152c586bbd27086739e98c573e0ee0e7fe283e6e465f8d46989dcb7a3"}
Feb 17 14:09:46 crc kubenswrapper[4836]: I0217 14:09:46.038118 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6rsds"]
Feb 17 14:09:46 crc kubenswrapper[4836]: I0217 14:09:46.238142 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c99af120-e5bb-45a6-baec-1157b240bda6","Type":"ContainerStarted","Data":"405d20c9f1da8228078b48fbb3f4c6d23ca94a32adbe89d4e90ada232dcd9609"}
Feb 17 14:09:46 crc kubenswrapper[4836]: I0217 14:09:46.415282 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:46 crc kubenswrapper[4836]: I0217 14:09:46.415719 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:56 crc kubenswrapper[4836]: I0217 14:09:56.415204 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:09:56 crc kubenswrapper[4836]: I0217 14:09:56.416207 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:09:59 crc kubenswrapper[4836]: I0217 14:09:59.764825 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:09:59 crc kubenswrapper[4836]: I0217 14:09:59.765515 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:09:59 crc kubenswrapper[4836]: I0217 14:09:59.765602 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g"
Feb 17 14:09:59 crc kubenswrapper[4836]: I0217 14:09:59.766903 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:09:59 crc kubenswrapper[4836]: I0217 14:09:59.767431 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb" gracePeriod=600
Feb 17 14:10:00 crc kubenswrapper[4836]: I0217 14:10:00.562800 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb" exitCode=0
Feb 17 14:10:00 crc kubenswrapper[4836]: I0217 14:10:00.562848 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb"}
Feb 17 14:10:02 crc kubenswrapper[4836]: E0217 14:10:02.510326 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 17 14:10:02 crc kubenswrapper[4836]: E0217 14:10:02.510574 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-thgpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-252vj_openshift-marketplace(a172042c-7dc6-4cea-906e-3d9135523f15): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:10:02 crc kubenswrapper[4836]: E0217 14:10:02.512583 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-252vj" podUID="a172042c-7dc6-4cea-906e-3d9135523f15"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.464416 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-252vj" podUID="a172042c-7dc6-4cea-906e-3d9135523f15"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.558871 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.559282 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnjxf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-5rfnm_openshift-marketplace(88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.564947 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5rfnm" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.567823 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.568050 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tknx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9w8zr_openshift-marketplace(089d1289-afe9-4ffe-9d96-ac10058335ed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.569633 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9w8zr" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.608958 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5rfnm" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff"
Feb 17 14:10:03 crc kubenswrapper[4836]: E0217 14:10:03.623879 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9w8zr" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed"
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.603604 4836 generic.go:334] "Generic (PLEG): container finished" podID="c85860a6-c3bb-448b-b812-cbf38230de01" containerID="8d61046d718ebf03dc13da6194072e3009ef971818f44de3733bfb8ab11c1f92" exitCode=0
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.603786 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kqqh" event={"ID":"c85860a6-c3bb-448b-b812-cbf38230de01","Type":"ContainerDied","Data":"8d61046d718ebf03dc13da6194072e3009ef971818f44de3733bfb8ab11c1f92"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.606527 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffb43a5d-f735-4891-912a-3ba9e47a4055","Type":"ContainerStarted","Data":"ce3e52bd320d663e1a8fc906cd936e572a8197efd430dd75fd6c81b3471e6dd4"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.610142 4836 generic.go:334] "Generic (PLEG): container finished" podID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerID="03ed8f2f65fff33093a6776fd604dcab5d3520ae863a96ba61bdb418d4e8293c" exitCode=0
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.610207 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxwhr" event={"ID":"e9f23804-837d-4d3c-94b7-7cdefe6e94df","Type":"ContainerDied","Data":"03ed8f2f65fff33093a6776fd604dcab5d3520ae863a96ba61bdb418d4e8293c"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.619578 4836 generic.go:334] "Generic (PLEG): container finished" podID="c99af120-e5bb-45a6-baec-1157b240bda6" containerID="775f2de68fea42d390b39e7697da438c32c1bcf52a235f512d84d601f8a51746" exitCode=0
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.619913 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c99af120-e5bb-45a6-baec-1157b240bda6","Type":"ContainerDied","Data":"775f2de68fea42d390b39e7697da438c32c1bcf52a235f512d84d601f8a51746"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.626764 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5cbbv" event={"ID":"d9eb5c8b-f3c7-4068-82c7-28520f6905c6","Type":"ContainerStarted","Data":"783e6f3ec1ddf5eee98e5c4ee5983e27d9e7b8f8f8789635783ab63380e75bcf"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.627068 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5cbbv"
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.627387 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.627434 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.629456 4836 generic.go:334] "Generic (PLEG): container finished" podID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerID="73b212b6f45054b199f4f919939777dc7461c5c17f33a2a285ccf07966ece193" exitCode=0
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.629528 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmk55" event={"ID":"f1bd4ed0-3b99-4446-9218-71bb589da4a4","Type":"ContainerDied","Data":"73b212b6f45054b199f4f919939777dc7461c5c17f33a2a285ccf07966ece193"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.636967 4836 generic.go:334] "Generic (PLEG): container finished" podID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerID="eac3b1d23d40a9e3d574bb39b162de6c6b11b16dff12abcfba61b9ba01c21760" exitCode=0
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.637082 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpvx" event={"ID":"c6c873c6-ddde-4b9b-9141-e6de9be567d4","Type":"ContainerDied","Data":"eac3b1d23d40a9e3d574bb39b162de6c6b11b16dff12abcfba61b9ba01c21760"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.648193 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"6ca471c2a83c51e21c02e6df84d64c6720d133c689bc0501ece1848cccb37b3b"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.654396 4836 generic.go:334] "Generic (PLEG): container finished" podID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerID="c45d87fc95c2bb97baee74cdf9eb8890199ccbcb1361ab9d40701a3bf1b0aef6" exitCode=0
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.654449 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfmw4" event={"ID":"8762f2f2-8375-4fdd-8a29-ea2ab598afa1","Type":"ContainerDied","Data":"c45d87fc95c2bb97baee74cdf9eb8890199ccbcb1361ab9d40701a3bf1b0aef6"}
Feb 17 14:10:04 crc kubenswrapper[4836]: I0217 14:10:04.685674 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=33.685633334 podStartE2EDuration="33.685633334s" podCreationTimestamp="2026-02-17 14:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:10:04.680245211 +0000 UTC m=+231.023173490" watchObservedRunningTime="2026-02-17 14:10:04.685633334 +0000 UTC m=+231.028561603"
Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.664597 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxwhr" event={"ID":"e9f23804-837d-4d3c-94b7-7cdefe6e94df","Type":"ContainerStarted","Data":"c2eaa809f67d2bf376430950b6f31e802fbb9ae20ab0242708d65041bbaf3f07"}
Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.667667 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfmw4"
event={"ID":"8762f2f2-8375-4fdd-8a29-ea2ab598afa1","Type":"ContainerStarted","Data":"63859747d78e0b196aa1ae4f9aecdf579a3667fc25d5a072dbcb78b4447b6dc2"} Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.670141 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kqqh" event={"ID":"c85860a6-c3bb-448b-b812-cbf38230de01","Type":"ContainerStarted","Data":"050ce9294d63c1cb12cc162f10c12c570ecb02593b7c8408dbd3bd6ac92c0e81"} Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.672806 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmk55" event={"ID":"f1bd4ed0-3b99-4446-9218-71bb589da4a4","Type":"ContainerStarted","Data":"6edde19e8fa08b860a5493cc87da492992d83a470155d9fcd528dfc9281f65eb"} Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.675979 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpvx" event={"ID":"c6c873c6-ddde-4b9b-9141-e6de9be567d4","Type":"ContainerStarted","Data":"a23a8c919a9eea21ab628dc33e93962e83f3bdb7249542d3b4905dd07bca224b"} Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.676939 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.677004 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.703700 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-pxwhr" podStartSLOduration=5.62098076 podStartE2EDuration="1m18.703663033s" podCreationTimestamp="2026-02-17 14:08:47 +0000 UTC" firstStartedPulling="2026-02-17 14:08:51.98066947 +0000 UTC m=+158.323597739" lastFinishedPulling="2026-02-17 14:10:05.063351743 +0000 UTC m=+231.406280012" observedRunningTime="2026-02-17 14:10:05.697615693 +0000 UTC m=+232.040543962" watchObservedRunningTime="2026-02-17 14:10:05.703663033 +0000 UTC m=+232.046591302" Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.726250 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cmk55" podStartSLOduration=7.557454774 podStartE2EDuration="1m20.726225102s" podCreationTimestamp="2026-02-17 14:08:45 +0000 UTC" firstStartedPulling="2026-02-17 14:08:52.004891994 +0000 UTC m=+158.347820253" lastFinishedPulling="2026-02-17 14:10:05.173662302 +0000 UTC m=+231.516590581" observedRunningTime="2026-02-17 14:10:05.724744163 +0000 UTC m=+232.067672432" watchObservedRunningTime="2026-02-17 14:10:05.726225102 +0000 UTC m=+232.069153371" Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.764373 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5kqqh" podStartSLOduration=5.571011954 podStartE2EDuration="1m18.764340464s" podCreationTimestamp="2026-02-17 14:08:47 +0000 UTC" firstStartedPulling="2026-02-17 14:08:51.926353048 +0000 UTC m=+158.269281327" lastFinishedPulling="2026-02-17 14:10:05.119681578 +0000 UTC m=+231.462609837" observedRunningTime="2026-02-17 14:10:05.759184628 +0000 UTC m=+232.102112917" watchObservedRunningTime="2026-02-17 14:10:05.764340464 +0000 UTC m=+232.107268743" Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.780224 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tmpvx" podStartSLOduration=7.564898172 
podStartE2EDuration="1m20.780202396s" podCreationTimestamp="2026-02-17 14:08:45 +0000 UTC" firstStartedPulling="2026-02-17 14:08:51.936706083 +0000 UTC m=+158.279634352" lastFinishedPulling="2026-02-17 14:10:05.152010307 +0000 UTC m=+231.494938576" observedRunningTime="2026-02-17 14:10:05.778527422 +0000 UTC m=+232.121455681" watchObservedRunningTime="2026-02-17 14:10:05.780202396 +0000 UTC m=+232.123130685" Feb 17 14:10:05 crc kubenswrapper[4836]: I0217 14:10:05.806062 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vfmw4" podStartSLOduration=7.577291771 podStartE2EDuration="1m20.806044132s" podCreationTimestamp="2026-02-17 14:08:45 +0000 UTC" firstStartedPulling="2026-02-17 14:08:51.986083604 +0000 UTC m=+158.329011873" lastFinishedPulling="2026-02-17 14:10:05.214835965 +0000 UTC m=+231.557764234" observedRunningTime="2026-02-17 14:10:05.805521828 +0000 UTC m=+232.148450117" watchObservedRunningTime="2026-02-17 14:10:05.806044132 +0000 UTC m=+232.148972401" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.103384 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.266840 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c99af120-e5bb-45a6-baec-1157b240bda6-kube-api-access\") pod \"c99af120-e5bb-45a6-baec-1157b240bda6\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.266909 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c99af120-e5bb-45a6-baec-1157b240bda6-kubelet-dir\") pod \"c99af120-e5bb-45a6-baec-1157b240bda6\" (UID: \"c99af120-e5bb-45a6-baec-1157b240bda6\") " Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.267101 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c99af120-e5bb-45a6-baec-1157b240bda6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c99af120-e5bb-45a6-baec-1157b240bda6" (UID: "c99af120-e5bb-45a6-baec-1157b240bda6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.267397 4836 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c99af120-e5bb-45a6-baec-1157b240bda6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.276490 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99af120-e5bb-45a6-baec-1157b240bda6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c99af120-e5bb-45a6-baec-1157b240bda6" (UID: "c99af120-e5bb-45a6-baec-1157b240bda6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.368884 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.369034 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.369449 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c99af120-e5bb-45a6-baec-1157b240bda6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.415630 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.415687 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.415718 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.415759 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.681928 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.681952 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c99af120-e5bb-45a6-baec-1157b240bda6","Type":"ContainerDied","Data":"405d20c9f1da8228078b48fbb3f4c6d23ca94a32adbe89d4e90ada232dcd9609"} Feb 17 14:10:06 crc kubenswrapper[4836]: I0217 14:10:06.683706 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405d20c9f1da8228078b48fbb3f4c6d23ca94a32adbe89d4e90ada232dcd9609" Feb 17 14:10:07 crc kubenswrapper[4836]: I0217 14:10:07.042289 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:10:07 crc kubenswrapper[4836]: I0217 14:10:07.042376 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:10:07 crc kubenswrapper[4836]: I0217 14:10:07.190453 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:10:07 crc kubenswrapper[4836]: I0217 14:10:07.190702 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:10:07 crc kubenswrapper[4836]: I0217 14:10:07.678654 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-tmpvx" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:07 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 
14:10:07 crc kubenswrapper[4836]: > Feb 17 14:10:07 crc kubenswrapper[4836]: I0217 14:10:07.869906 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:10:07 crc kubenswrapper[4836]: I0217 14:10:07.869990 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:10:08 crc kubenswrapper[4836]: I0217 14:10:08.091080 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vfmw4" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:08 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:10:08 crc kubenswrapper[4836]: > Feb 17 14:10:08 crc kubenswrapper[4836]: I0217 14:10:08.144764 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:10:08 crc kubenswrapper[4836]: I0217 14:10:08.144841 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:10:08 crc kubenswrapper[4836]: I0217 14:10:08.429035 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-cmk55" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:08 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:10:08 crc kubenswrapper[4836]: > Feb 17 14:10:08 crc kubenswrapper[4836]: I0217 14:10:08.924766 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-pxwhr" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:08 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 
1s Feb 17 14:10:08 crc kubenswrapper[4836]: > Feb 17 14:10:09 crc kubenswrapper[4836]: I0217 14:10:09.195485 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5kqqh" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:09 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:10:09 crc kubenswrapper[4836]: > Feb 17 14:10:11 crc kubenswrapper[4836]: I0217 14:10:11.066690 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" podUID="c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" containerName="oauth-openshift" containerID="cri-o://374ae013639e0d9afa8e234c5feaec0812c5a1b8b7085c57cb72bf432395a8d0" gracePeriod=15 Feb 17 14:10:11 crc kubenswrapper[4836]: I0217 14:10:11.726913 4836 generic.go:334] "Generic (PLEG): container finished" podID="c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" containerID="374ae013639e0d9afa8e234c5feaec0812c5a1b8b7085c57cb72bf432395a8d0" exitCode=0 Feb 17 14:10:11 crc kubenswrapper[4836]: I0217 14:10:11.726977 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" event={"ID":"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7","Type":"ContainerDied","Data":"374ae013639e0d9afa8e234c5feaec0812c5a1b8b7085c57cb72bf432395a8d0"} Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.545113 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598438 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-provider-selection\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598512 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-ocp-branding-template\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598566 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-router-certs\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598605 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6htjx\" (UniqueName: \"kubernetes.io/projected/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-kube-api-access-6htjx\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598684 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-login\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: 
\"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598739 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-error\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598780 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-cliconfig\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598823 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-trusted-ca-bundle\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598897 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-idp-0-file-data\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.598977 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-session\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc 
kubenswrapper[4836]: I0217 14:10:12.599035 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-service-ca\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.599068 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-policies\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.599111 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-serving-cert\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.599151 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-dir\") pod \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\" (UID: \"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7\") " Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.600545 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.600575 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.601115 4836 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.601111 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-574dcf5686-dttcs"] Feb 17 14:10:12 crc kubenswrapper[4836]: E0217 14:10:12.602152 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99af120-e5bb-45a6-baec-1157b240bda6" containerName="pruner" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.602182 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99af120-e5bb-45a6-baec-1157b240bda6" containerName="pruner" Feb 17 14:10:12 crc kubenswrapper[4836]: E0217 14:10:12.602256 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" containerName="oauth-openshift" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.602267 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" containerName="oauth-openshift" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.602624 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99af120-e5bb-45a6-baec-1157b240bda6" containerName="pruner" Feb 17 14:10:12 
crc kubenswrapper[4836]: I0217 14:10:12.602662 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" containerName="oauth-openshift" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.605766 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.601173 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.601162 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.601227 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.602151 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.615738 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-574dcf5686-dttcs"] Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.621724 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.622305 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.625668 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.627164 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.627700 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.628835 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-kube-api-access-6htjx" (OuterVolumeSpecName: "kube-api-access-6htjx") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "kube-api-access-6htjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.637149 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.637598 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.640262 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" (UID: "c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708359 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-error\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708423 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 
17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708446 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708470 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-session\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708497 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708544 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45204263-0159-4c86-b81a-a900db07b14f-audit-dir\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708565 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-audit-policies\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708631 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708723 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdct7\" (UniqueName: \"kubernetes.io/projected/45204263-0159-4c86-b81a-a900db07b14f-kube-api-access-wdct7\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708788 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708806 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-service-ca\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: 
\"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708831 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708848 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-login\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708864 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-router-certs\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708971 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.708988 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709002 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709014 4836 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709026 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709036 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709045 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709055 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6htjx\" (UniqueName: \"kubernetes.io/projected/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-kube-api-access-6htjx\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709063 4836 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709073 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709082 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.709093 4836 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.733141 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" event={"ID":"c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7","Type":"ContainerDied","Data":"f9b98c0ff2091be32d114061b6cc2daa5338c6afb2d30dae1e18fe2afc9b3ea3"} Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.733202 4836 scope.go:117] "RemoveContainer" containerID="374ae013639e0d9afa8e234c5feaec0812c5a1b8b7085c57cb72bf432395a8d0" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.733235 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6rsds" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.768192 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6rsds"] Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.771677 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6rsds"] Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811363 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811478 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdct7\" (UniqueName: \"kubernetes.io/projected/45204263-0159-4c86-b81a-a900db07b14f-kube-api-access-wdct7\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811540 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811570 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-service-ca\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811624 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811693 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-login\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811722 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-router-certs\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811796 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-error\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " 
pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811824 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811878 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811904 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-session\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.811971 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.812025 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45204263-0159-4c86-b81a-a900db07b14f-audit-dir\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.812061 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-audit-policies\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.813160 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-audit-policies\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.813776 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.813955 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45204263-0159-4c86-b81a-a900db07b14f-audit-dir\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 
14:10:12.815243 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.815404 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-error\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.815873 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-service-ca\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.818258 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-router-certs\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.818402 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.818545 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-login\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.819496 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.819891 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-session\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.822046 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.823788 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45204263-0159-4c86-b81a-a900db07b14f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.827358 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdct7\" (UniqueName: \"kubernetes.io/projected/45204263-0159-4c86-b81a-a900db07b14f-kube-api-access-wdct7\") pod \"oauth-openshift-574dcf5686-dttcs\" (UID: \"45204263-0159-4c86-b81a-a900db07b14f\") " pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:12 crc kubenswrapper[4836]: I0217 14:10:12.972344 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.075123 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-574dcf5686-dttcs"] Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.580128 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7" path="/var/lib/kubelet/pods/c2d8fb42-9c68-4eb3-a8c9-4e4a98772ae7/volumes" Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.801729 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" event={"ID":"45204263-0159-4c86-b81a-a900db07b14f","Type":"ContainerStarted","Data":"45d78833c75355ecb9992b443e268d8aac688a767fe2b22a8848b9f4142aa91f"} Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.801838 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" 
event={"ID":"45204263-0159-4c86-b81a-a900db07b14f","Type":"ContainerStarted","Data":"f076f81bf29321a523aa5373a441006ff695ed5c2c0f577c0822b4a5c8173c20"} Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.805824 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.809729 4836 patch_prober.go:28] interesting pod/oauth-openshift-574dcf5686-dttcs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" start-of-body= Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.809800 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" podUID="45204263-0159-4c86-b81a-a900db07b14f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.57:6443/healthz\": dial tcp 10.217.0.57:6443: connect: connection refused" Feb 17 14:10:14 crc kubenswrapper[4836]: I0217 14:10:14.830956 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" podStartSLOduration=28.830922267 podStartE2EDuration="28.830922267s" podCreationTimestamp="2026-02-17 14:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:10:14.828043121 +0000 UTC m=+241.170971400" watchObservedRunningTime="2026-02-17 14:10:14.830922267 +0000 UTC m=+241.173850536" Feb 17 14:10:15 crc kubenswrapper[4836]: I0217 14:10:15.818288 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-574dcf5686-dttcs" Feb 17 14:10:16 crc kubenswrapper[4836]: I0217 14:10:16.425317 4836 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:10:16 crc kubenswrapper[4836]: I0217 14:10:16.425701 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:10:16 crc kubenswrapper[4836]: I0217 14:10:16.426062 4836 patch_prober.go:28] interesting pod/downloads-7954f5f757-5cbbv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 17 14:10:16 crc kubenswrapper[4836]: I0217 14:10:16.426082 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5cbbv" podUID="d9eb5c8b-f3c7-4068-82c7-28520f6905c6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 17 14:10:16 crc kubenswrapper[4836]: I0217 14:10:16.478631 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:10:16 crc kubenswrapper[4836]: I0217 14:10:16.527318 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:10:16 crc kubenswrapper[4836]: I0217 14:10:16.821620 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-252vj" event={"ID":"a172042c-7dc6-4cea-906e-3d9135523f15","Type":"ContainerStarted","Data":"5fe13927481d2948ed6f845b9678013bbf8fcbf061f7116c0ec82c5abd9ee696"} 
Feb 17 14:10:17 crc kubenswrapper[4836]: I0217 14:10:17.218878 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:10:17 crc kubenswrapper[4836]: I0217 14:10:17.265844 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:10:17 crc kubenswrapper[4836]: I0217 14:10:17.391790 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:10:17 crc kubenswrapper[4836]: I0217 14:10:17.391885 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.070646 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmk55"] Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.264491 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.581273 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.638120 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.643935 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.878962 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" 
event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerStarted","Data":"c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c"} Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.885307 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8zr" event={"ID":"089d1289-afe9-4ffe-9d96-ac10058335ed","Type":"ContainerStarted","Data":"12b9c51f4d9306ca0c2b4adb55d1695962298f8f615d1a514d7884045bb5aea1"} Feb 17 14:10:18 crc kubenswrapper[4836]: I0217 14:10:18.885987 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cmk55" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="registry-server" containerID="cri-o://6edde19e8fa08b860a5493cc87da492992d83a470155d9fcd528dfc9281f65eb" gracePeriod=2 Feb 17 14:10:19 crc kubenswrapper[4836]: I0217 14:10:19.880026 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmpvx"] Feb 17 14:10:19 crc kubenswrapper[4836]: I0217 14:10:19.880692 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmpvx" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="registry-server" containerID="cri-o://a23a8c919a9eea21ab628dc33e93962e83f3bdb7249542d3b4905dd07bca224b" gracePeriod=2 Feb 17 14:10:19 crc kubenswrapper[4836]: I0217 14:10:19.906143 4836 generic.go:334] "Generic (PLEG): container finished" podID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerID="6edde19e8fa08b860a5493cc87da492992d83a470155d9fcd528dfc9281f65eb" exitCode=0 Feb 17 14:10:19 crc kubenswrapper[4836]: I0217 14:10:19.906199 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmk55" event={"ID":"f1bd4ed0-3b99-4446-9218-71bb589da4a4","Type":"ContainerDied","Data":"6edde19e8fa08b860a5493cc87da492992d83a470155d9fcd528dfc9281f65eb"} Feb 17 14:10:20 crc 
kubenswrapper[4836]: I0217 14:10:20.930141 4836 generic.go:334] "Generic (PLEG): container finished" podID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerID="a23a8c919a9eea21ab628dc33e93962e83f3bdb7249542d3b4905dd07bca224b" exitCode=0 Feb 17 14:10:20 crc kubenswrapper[4836]: I0217 14:10:20.930274 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpvx" event={"ID":"c6c873c6-ddde-4b9b-9141-e6de9be567d4","Type":"ContainerDied","Data":"a23a8c919a9eea21ab628dc33e93962e83f3bdb7249542d3b4905dd07bca224b"} Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.230144 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.421251 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-utilities\") pod \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.421417 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmwlv\" (UniqueName: \"kubernetes.io/projected/f1bd4ed0-3b99-4446-9218-71bb589da4a4-kube-api-access-fmwlv\") pod \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.421470 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-catalog-content\") pod \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\" (UID: \"f1bd4ed0-3b99-4446-9218-71bb589da4a4\") " Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.540390 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1bd4ed0-3b99-4446-9218-71bb589da4a4" (UID: "f1bd4ed0-3b99-4446-9218-71bb589da4a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.541707 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.543175 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-utilities" (OuterVolumeSpecName: "utilities") pod "f1bd4ed0-3b99-4446-9218-71bb589da4a4" (UID: "f1bd4ed0-3b99-4446-9218-71bb589da4a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.637913 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1bd4ed0-3b99-4446-9218-71bb589da4a4-kube-api-access-fmwlv" (OuterVolumeSpecName: "kube-api-access-fmwlv") pod "f1bd4ed0-3b99-4446-9218-71bb589da4a4" (UID: "f1bd4ed0-3b99-4446-9218-71bb589da4a4"). InnerVolumeSpecName "kube-api-access-fmwlv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.642515 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1bd4ed0-3b99-4446-9218-71bb589da4a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.642544 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmwlv\" (UniqueName: \"kubernetes.io/projected/f1bd4ed0-3b99-4446-9218-71bb589da4a4-kube-api-access-fmwlv\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.980983 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmk55" event={"ID":"f1bd4ed0-3b99-4446-9218-71bb589da4a4","Type":"ContainerDied","Data":"6e75d917f9b18c07b2feade7d6ceab556bb6226e0a78e8a3d47b72928e406bad"} Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.981585 4836 scope.go:117] "RemoveContainer" containerID="6edde19e8fa08b860a5493cc87da492992d83a470155d9fcd528dfc9281f65eb" Feb 17 14:10:21 crc kubenswrapper[4836]: I0217 14:10:21.981785 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmk55" Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.016458 4836 generic.go:334] "Generic (PLEG): container finished" podID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerID="12b9c51f4d9306ca0c2b4adb55d1695962298f8f615d1a514d7884045bb5aea1" exitCode=0 Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.016529 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8zr" event={"ID":"089d1289-afe9-4ffe-9d96-ac10058335ed","Type":"ContainerDied","Data":"12b9c51f4d9306ca0c2b4adb55d1695962298f8f615d1a514d7884045bb5aea1"} Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.260429 4836 scope.go:117] "RemoveContainer" containerID="73b212b6f45054b199f4f919939777dc7461c5c17f33a2a285ccf07966ece193" Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.262316 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kqqh"] Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.262649 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5kqqh" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="registry-server" containerID="cri-o://050ce9294d63c1cb12cc162f10c12c570ecb02593b7c8408dbd3bd6ac92c0e81" gracePeriod=2 Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.270579 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmk55"] Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.273339 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cmk55"] Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.642790 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" path="/var/lib/kubelet/pods/f1bd4ed0-3b99-4446-9218-71bb589da4a4/volumes" Feb 17 14:10:22 crc 
kubenswrapper[4836]: I0217 14:10:22.987529 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:10:22 crc kubenswrapper[4836]: I0217 14:10:22.995675 4836 scope.go:117] "RemoveContainer" containerID="9679e4b4c4f0f644eb56ce9a9ac7ad7178d79f35bc0d94642d6b2ded1809a114" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.032016 4836 generic.go:334] "Generic (PLEG): container finished" podID="c85860a6-c3bb-448b-b812-cbf38230de01" containerID="050ce9294d63c1cb12cc162f10c12c570ecb02593b7c8408dbd3bd6ac92c0e81" exitCode=0 Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.032094 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kqqh" event={"ID":"c85860a6-c3bb-448b-b812-cbf38230de01","Type":"ContainerDied","Data":"050ce9294d63c1cb12cc162f10c12c570ecb02593b7c8408dbd3bd6ac92c0e81"} Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.045917 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmpvx" event={"ID":"c6c873c6-ddde-4b9b-9141-e6de9be567d4","Type":"ContainerDied","Data":"79e0157c4fae70c4a163e7552bd45039fe6e084cf3fa63db4fbd428401695df6"} Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.045993 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmpvx" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.045991 4836 scope.go:117] "RemoveContainer" containerID="a23a8c919a9eea21ab628dc33e93962e83f3bdb7249542d3b4905dd07bca224b" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.065023 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fk4b\" (UniqueName: \"kubernetes.io/projected/c6c873c6-ddde-4b9b-9141-e6de9be567d4-kube-api-access-7fk4b\") pod \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.065136 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-catalog-content\") pod \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.065272 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-utilities\") pod \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\" (UID: \"c6c873c6-ddde-4b9b-9141-e6de9be567d4\") " Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.067010 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-utilities" (OuterVolumeSpecName: "utilities") pod "c6c873c6-ddde-4b9b-9141-e6de9be567d4" (UID: "c6c873c6-ddde-4b9b-9141-e6de9be567d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.104788 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c873c6-ddde-4b9b-9141-e6de9be567d4-kube-api-access-7fk4b" (OuterVolumeSpecName: "kube-api-access-7fk4b") pod "c6c873c6-ddde-4b9b-9141-e6de9be567d4" (UID: "c6c873c6-ddde-4b9b-9141-e6de9be567d4"). InnerVolumeSpecName "kube-api-access-7fk4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.125654 4836 scope.go:117] "RemoveContainer" containerID="eac3b1d23d40a9e3d574bb39b162de6c6b11b16dff12abcfba61b9ba01c21760" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.151731 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6c873c6-ddde-4b9b-9141-e6de9be567d4" (UID: "c6c873c6-ddde-4b9b-9141-e6de9be567d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.167064 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.167108 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fk4b\" (UniqueName: \"kubernetes.io/projected/c6c873c6-ddde-4b9b-9141-e6de9be567d4-kube-api-access-7fk4b\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.167123 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c873c6-ddde-4b9b-9141-e6de9be567d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.274638 4836 scope.go:117] "RemoveContainer" containerID="8aece0956593a85800757e782bdc3eb1d3d87f1ac99e3fc8ce9f7012a48be219" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.365653 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.400489 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmpvx"] Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.401693 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmpvx"] Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.470695 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8k6d\" (UniqueName: \"kubernetes.io/projected/c85860a6-c3bb-448b-b812-cbf38230de01-kube-api-access-t8k6d\") pod \"c85860a6-c3bb-448b-b812-cbf38230de01\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.470816 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-utilities\") pod \"c85860a6-c3bb-448b-b812-cbf38230de01\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.470960 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-catalog-content\") pod \"c85860a6-c3bb-448b-b812-cbf38230de01\" (UID: \"c85860a6-c3bb-448b-b812-cbf38230de01\") " Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.472381 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-utilities" (OuterVolumeSpecName: "utilities") pod "c85860a6-c3bb-448b-b812-cbf38230de01" (UID: "c85860a6-c3bb-448b-b812-cbf38230de01"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.475252 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85860a6-c3bb-448b-b812-cbf38230de01-kube-api-access-t8k6d" (OuterVolumeSpecName: "kube-api-access-t8k6d") pod "c85860a6-c3bb-448b-b812-cbf38230de01" (UID: "c85860a6-c3bb-448b-b812-cbf38230de01"). InnerVolumeSpecName "kube-api-access-t8k6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.498605 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c85860a6-c3bb-448b-b812-cbf38230de01" (UID: "c85860a6-c3bb-448b-b812-cbf38230de01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.572011 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8k6d\" (UniqueName: \"kubernetes.io/projected/c85860a6-c3bb-448b-b812-cbf38230de01-kube-api-access-t8k6d\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.572068 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:23 crc kubenswrapper[4836]: I0217 14:10:23.572079 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85860a6-c3bb-448b-b812-cbf38230de01-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.056587 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8zr" 
event={"ID":"089d1289-afe9-4ffe-9d96-ac10058335ed","Type":"ContainerStarted","Data":"56a4ac051fd52f2fd8e193686dffb745df251c7f892fec72d600a2fa80ecbd34"} Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.058326 4836 generic.go:334] "Generic (PLEG): container finished" podID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerID="c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c" exitCode=0 Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.058402 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerDied","Data":"c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c"} Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.067583 4836 generic.go:334] "Generic (PLEG): container finished" podID="a172042c-7dc6-4cea-906e-3d9135523f15" containerID="5fe13927481d2948ed6f845b9678013bbf8fcbf061f7116c0ec82c5abd9ee696" exitCode=0 Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.067680 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-252vj" event={"ID":"a172042c-7dc6-4cea-906e-3d9135523f15","Type":"ContainerDied","Data":"5fe13927481d2948ed6f845b9678013bbf8fcbf061f7116c0ec82c5abd9ee696"} Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.072093 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5kqqh" event={"ID":"c85860a6-c3bb-448b-b812-cbf38230de01","Type":"ContainerDied","Data":"f9efa614ea777c6c1f7f2234c739bb0e406ce4096c5477be16d8aba1cfb4c85e"} Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.072152 4836 scope.go:117] "RemoveContainer" containerID="050ce9294d63c1cb12cc162f10c12c570ecb02593b7c8408dbd3bd6ac92c0e81" Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.072191 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5kqqh" Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.079430 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9w8zr" podStartSLOduration=8.918052428 podStartE2EDuration="1m40.079411311s" podCreationTimestamp="2026-02-17 14:08:44 +0000 UTC" firstStartedPulling="2026-02-17 14:08:51.932978604 +0000 UTC m=+158.275906873" lastFinishedPulling="2026-02-17 14:10:23.094337487 +0000 UTC m=+249.437265756" observedRunningTime="2026-02-17 14:10:24.078075756 +0000 UTC m=+250.421004115" watchObservedRunningTime="2026-02-17 14:10:24.079411311 +0000 UTC m=+250.422339600" Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.094348 4836 scope.go:117] "RemoveContainer" containerID="8d61046d718ebf03dc13da6194072e3009ef971818f44de3733bfb8ab11c1f92" Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.112874 4836 scope.go:117] "RemoveContainer" containerID="d36870560f8d1243c818dca57cf74dea7a07e8c43795bb396db32ccfc2a302b6" Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.165159 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kqqh"] Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.167902 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5kqqh"] Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.575258 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" path="/var/lib/kubelet/pods/c6c873c6-ddde-4b9b-9141-e6de9be567d4/volumes" Feb 17 14:10:24 crc kubenswrapper[4836]: I0217 14:10:24.575989 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" path="/var/lib/kubelet/pods/c85860a6-c3bb-448b-b812-cbf38230de01/volumes" Feb 17 14:10:25 crc kubenswrapper[4836]: I0217 14:10:25.082136 4836 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerStarted","Data":"cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375"} Feb 17 14:10:25 crc kubenswrapper[4836]: I0217 14:10:25.083793 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-252vj" event={"ID":"a172042c-7dc6-4cea-906e-3d9135523f15","Type":"ContainerStarted","Data":"6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa"} Feb 17 14:10:25 crc kubenswrapper[4836]: I0217 14:10:25.118878 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5rfnm" podStartSLOduration=4.357341982 podStartE2EDuration="1m37.118848259s" podCreationTimestamp="2026-02-17 14:08:48 +0000 UTC" firstStartedPulling="2026-02-17 14:08:51.976065278 +0000 UTC m=+158.318993547" lastFinishedPulling="2026-02-17 14:10:24.737571555 +0000 UTC m=+251.080499824" observedRunningTime="2026-02-17 14:10:25.115795308 +0000 UTC m=+251.458723597" watchObservedRunningTime="2026-02-17 14:10:25.118848259 +0000 UTC m=+251.461776528" Feb 17 14:10:25 crc kubenswrapper[4836]: I0217 14:10:25.138895 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-252vj" podStartSLOduration=4.503022009 podStartE2EDuration="1m37.138843849s" podCreationTimestamp="2026-02-17 14:08:48 +0000 UTC" firstStartedPulling="2026-02-17 14:08:51.99005929 +0000 UTC m=+158.332987559" lastFinishedPulling="2026-02-17 14:10:24.62588112 +0000 UTC m=+250.968809399" observedRunningTime="2026-02-17 14:10:25.134018451 +0000 UTC m=+251.476946740" watchObservedRunningTime="2026-02-17 14:10:25.138843849 +0000 UTC m=+251.481772118" Feb 17 14:10:25 crc kubenswrapper[4836]: I0217 14:10:25.665887 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:10:25 crc kubenswrapper[4836]: I0217 14:10:25.665939 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:10:26 crc kubenswrapper[4836]: I0217 14:10:26.431247 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5cbbv" Feb 17 14:10:26 crc kubenswrapper[4836]: I0217 14:10:26.721629 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9w8zr" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:26 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:10:26 crc kubenswrapper[4836]: > Feb 17 14:10:28 crc kubenswrapper[4836]: I0217 14:10:28.574166 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:10:28 crc kubenswrapper[4836]: I0217 14:10:28.606832 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:10:29 crc kubenswrapper[4836]: I0217 14:10:29.482993 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:10:29 crc kubenswrapper[4836]: I0217 14:10:29.483466 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:10:29 crc kubenswrapper[4836]: I0217 14:10:29.628174 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-252vj" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:29 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:10:29 crc 
kubenswrapper[4836]: > Feb 17 14:10:30 crc kubenswrapper[4836]: I0217 14:10:30.541090 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5rfnm" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerName="registry-server" probeResult="failure" output=< Feb 17 14:10:30 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:10:30 crc kubenswrapper[4836]: > Feb 17 14:10:35 crc kubenswrapper[4836]: I0217 14:10:35.666595 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:10:35 crc kubenswrapper[4836]: I0217 14:10:35.710071 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:10:38 crc kubenswrapper[4836]: I0217 14:10:38.623039 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:10:38 crc kubenswrapper[4836]: I0217 14:10:38.673568 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:10:39 crc kubenswrapper[4836]: I0217 14:10:39.530950 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:10:39 crc kubenswrapper[4836]: I0217 14:10:39.569911 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:10:40 crc kubenswrapper[4836]: I0217 14:10:40.210925 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5rfnm"] Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.229654 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5rfnm" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" 
containerName="registry-server" containerID="cri-o://cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375" gracePeriod=2 Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.597879 4836 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.598950 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.598972 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599011 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599020 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599029 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599038 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599057 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599082 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599098 4836 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599105 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599122 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599129 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599163 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599170 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599181 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599189 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.599199 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599205 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599364 4836 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c85860a6-c3bb-448b-b812-cbf38230de01" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599455 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c873c6-ddde-4b9b-9141-e6de9be567d4" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.599469 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1bd4ed0-3b99-4446-9218-71bb589da4a4" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.600381 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.638287 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.663820 4836 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.664230 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71" gracePeriod=15 Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.664403 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0" gracePeriod=15 Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.664393 4836 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3" gracePeriod=15 Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.664566 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b" gracePeriod=15 Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.664538 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c" gracePeriod=15 Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.667596 4836 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.667877 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.667903 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.667917 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.667925 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" 
containerName="extract-utilities" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.667943 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668331 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668352 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668361 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668371 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668377 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668386 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668392 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668399 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668408 4836 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668419 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668425 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerName="extract-content" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668433 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668439 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668447 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668453 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668554 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668577 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668585 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 
14:10:41.668592 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668601 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668611 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668620 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerName="registry-server" Feb 17 14:10:41 crc kubenswrapper[4836]: E0217 14:10:41.668745 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668756 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.668888 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.844899 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.852775 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-utilities\") pod \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.852905 
4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnjxf\" (UniqueName: \"kubernetes.io/projected/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-kube-api-access-jnjxf\") pod \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853020 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-catalog-content\") pod \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\" (UID: \"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff\") " Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853306 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853340 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853431 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853499 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853617 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853647 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853680 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.853695 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.856786 4836 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-utilities" (OuterVolumeSpecName: "utilities") pod "88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" (UID: "88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.905956 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-kube-api-access-jnjxf" (OuterVolumeSpecName: "kube-api-access-jnjxf") pod "88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" (UID: "88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff"). InnerVolumeSpecName "kube-api-access-jnjxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.955749 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.955815 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.955856 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 
14:10:41.955892 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.955936 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.955957 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.955982 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.955998 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956067 4836 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956084 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnjxf\" (UniqueName: \"kubernetes.io/projected/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-kube-api-access-jnjxf\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956143 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956192 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956216 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956240 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956268 
4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956314 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956336 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.956360 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:41 crc kubenswrapper[4836]: I0217 14:10:41.994472 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" (UID: "88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.057664 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.143801 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:10:42 crc kubenswrapper[4836]: E0217 14:10:42.171877 4836 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950e04c9e65df6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:10:42.170134006 +0000 UTC m=+268.513062275,LastTimestamp:2026-02-17 14:10:42.170134006 +0000 UTC m=+268.513062275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.244143 4836 generic.go:334] "Generic (PLEG): container finished" podID="ffb43a5d-f735-4891-912a-3ba9e47a4055" containerID="ce3e52bd320d663e1a8fc906cd936e572a8197efd430dd75fd6c81b3471e6dd4" exitCode=0 
Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.244248 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffb43a5d-f735-4891-912a-3ba9e47a4055","Type":"ContainerDied","Data":"ce3e52bd320d663e1a8fc906cd936e572a8197efd430dd75fd6c81b3471e6dd4"} Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.245520 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.246076 4836 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.246307 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.247218 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"17a42e4ab2b42f2910a34e8c55afe5dfb679b723c809ad5f44ffa7b713039e7e"} Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.252316 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.254384 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.255688 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c" exitCode=0 Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.255716 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b" exitCode=0 Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.255729 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3" exitCode=0 Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.255738 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0" exitCode=2 Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.255825 4836 scope.go:117] "RemoveContainer" containerID="a13a546a929cc67fa3a708f23186f65c3dbc28afd67a421a66c732b8308e4177" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.261595 4836 generic.go:334] "Generic (PLEG): container finished" podID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" containerID="cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375" exitCode=0 Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.261671 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" 
event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerDied","Data":"cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375"} Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.261700 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5rfnm" event={"ID":"88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff","Type":"ContainerDied","Data":"f4532e92cda0f4cb49095bb6a57a64c5099b9a9601e8744de1408686bb81a1cb"} Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.261740 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5rfnm" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.262684 4836 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.263226 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.263958 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.264181 4836 status_manager.go:851] "Failed to get status for pod" 
podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.279550 4836 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.280032 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.280333 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.280619 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.300716 4836 scope.go:117] "RemoveContainer" 
containerID="cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.328439 4836 scope.go:117] "RemoveContainer" containerID="c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.346241 4836 scope.go:117] "RemoveContainer" containerID="6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.374663 4836 scope.go:117] "RemoveContainer" containerID="cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375" Feb 17 14:10:42 crc kubenswrapper[4836]: E0217 14:10:42.376114 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375\": container with ID starting with cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375 not found: ID does not exist" containerID="cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.376166 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375"} err="failed to get container status \"cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375\": rpc error: code = NotFound desc = could not find container \"cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375\": container with ID starting with cde4d60e0777293bf4b491417a101ff54d8c34c827d15b3c2d4b38007512d375 not found: ID does not exist" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.376200 4836 scope.go:117] "RemoveContainer" containerID="c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c" Feb 17 14:10:42 crc kubenswrapper[4836]: E0217 14:10:42.376614 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c\": container with ID starting with c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c not found: ID does not exist" containerID="c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.376667 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c"} err="failed to get container status \"c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c\": rpc error: code = NotFound desc = could not find container \"c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c\": container with ID starting with c3a9e8665fc73a09585fc2a34fa81b5e1b737549477832ab2c07840098e0bf3c not found: ID does not exist" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.376709 4836 scope.go:117] "RemoveContainer" containerID="6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8" Feb 17 14:10:42 crc kubenswrapper[4836]: E0217 14:10:42.377104 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8\": container with ID starting with 6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8 not found: ID does not exist" containerID="6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8" Feb 17 14:10:42 crc kubenswrapper[4836]: I0217 14:10:42.377152 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8"} err="failed to get container status \"6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8\": rpc error: code = NotFound desc = could not find container 
\"6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8\": container with ID starting with 6bb0c69e3bd51a902da87668ec1687c119230884a9fa4dff71d17ec28d9300e8 not found: ID does not exist" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.272291 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.277528 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e"} Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.279516 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.279892 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.280101 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:43 crc 
kubenswrapper[4836]: I0217 14:10:43.516307 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.516903 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.517205 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.517585 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.577437 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-kubelet-dir\") pod \"ffb43a5d-f735-4891-912a-3ba9e47a4055\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.577521 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-var-lock\") pod 
\"ffb43a5d-f735-4891-912a-3ba9e47a4055\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.577587 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffb43a5d-f735-4891-912a-3ba9e47a4055-kube-api-access\") pod \"ffb43a5d-f735-4891-912a-3ba9e47a4055\" (UID: \"ffb43a5d-f735-4891-912a-3ba9e47a4055\") " Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.577586 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ffb43a5d-f735-4891-912a-3ba9e47a4055" (UID: "ffb43a5d-f735-4891-912a-3ba9e47a4055"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.577658 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-var-lock" (OuterVolumeSpecName: "var-lock") pod "ffb43a5d-f735-4891-912a-3ba9e47a4055" (UID: "ffb43a5d-f735-4891-912a-3ba9e47a4055"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.577833 4836 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.577848 4836 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffb43a5d-f735-4891-912a-3ba9e47a4055-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.583348 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb43a5d-f735-4891-912a-3ba9e47a4055-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ffb43a5d-f735-4891-912a-3ba9e47a4055" (UID: "ffb43a5d-f735-4891-912a-3ba9e47a4055"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:10:43 crc kubenswrapper[4836]: I0217 14:10:43.678816 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffb43a5d-f735-4891-912a-3ba9e47a4055-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.285338 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffb43a5d-f735-4891-912a-3ba9e47a4055","Type":"ContainerDied","Data":"29d09f7152c586bbd27086739e98c573e0ee0e7fe283e6e465f8d46989dcb7a3"} Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.286107 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d09f7152c586bbd27086739e98c573e0ee0e7fe283e6e465f8d46989dcb7a3" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.285534 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.336929 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.337271 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.337705 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.340103 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.340833 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.341219 4836 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.341532 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.341811 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.342075 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388002 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 
14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388058 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388100 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388177 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388178 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388209 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388554 4836 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388607 4836 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.388620 4836 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.570730 4836 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.571058 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.571353 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: 
connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.571583 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:44 crc kubenswrapper[4836]: I0217 14:10:44.575597 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 14:10:44 crc kubenswrapper[4836]: E0217 14:10:44.588741 4836 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950e04c9e65df6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:10:42.170134006 +0000 UTC m=+268.513062275,LastTimestamp:2026-02-17 14:10:42.170134006 +0000 UTC m=+268.513062275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.293545 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.294131 4836 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71" exitCode=0 Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.294181 4836 scope.go:117] "RemoveContainer" containerID="14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.294335 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.295135 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.295356 4836 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.295650 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 
14:10:45.295805 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.301401 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.301679 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.301887 4836 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.302097 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.317748 4836 
scope.go:117] "RemoveContainer" containerID="2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.344965 4836 scope.go:117] "RemoveContainer" containerID="9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.371373 4836 scope.go:117] "RemoveContainer" containerID="edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.388839 4836 scope.go:117] "RemoveContainer" containerID="281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.404145 4836 scope.go:117] "RemoveContainer" containerID="bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.431549 4836 scope.go:117] "RemoveContainer" containerID="14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c" Feb 17 14:10:45 crc kubenswrapper[4836]: E0217 14:10:45.432859 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\": container with ID starting with 14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c not found: ID does not exist" containerID="14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.432932 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c"} err="failed to get container status \"14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\": rpc error: code = NotFound desc = could not find container \"14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c\": container with ID starting with 
14e1fa543737ef273b75b2cffc78dc2edf4d1673c2c4095edda7c40db8238f2c not found: ID does not exist" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.432975 4836 scope.go:117] "RemoveContainer" containerID="2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b" Feb 17 14:10:45 crc kubenswrapper[4836]: E0217 14:10:45.433408 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\": container with ID starting with 2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b not found: ID does not exist" containerID="2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.433536 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b"} err="failed to get container status \"2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\": rpc error: code = NotFound desc = could not find container \"2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b\": container with ID starting with 2b36202875b19f7dafcfde168e006d9d47571de30240d5886e5e0920b74b609b not found: ID does not exist" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.433667 4836 scope.go:117] "RemoveContainer" containerID="9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3" Feb 17 14:10:45 crc kubenswrapper[4836]: E0217 14:10:45.434133 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\": container with ID starting with 9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3 not found: ID does not exist" containerID="9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3" Feb 17 14:10:45 crc 
kubenswrapper[4836]: I0217 14:10:45.434171 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3"} err="failed to get container status \"9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\": rpc error: code = NotFound desc = could not find container \"9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3\": container with ID starting with 9bc90fa5f9363034d51c35721493655d10f8f8d5fad4dda86ca4c882963fb7a3 not found: ID does not exist" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.434212 4836 scope.go:117] "RemoveContainer" containerID="edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0" Feb 17 14:10:45 crc kubenswrapper[4836]: E0217 14:10:45.434526 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\": container with ID starting with edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0 not found: ID does not exist" containerID="edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.434580 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0"} err="failed to get container status \"edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\": rpc error: code = NotFound desc = could not find container \"edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0\": container with ID starting with edaaa277a27d7e7b9e8b42acaa9ee04fc0933b440441ad3f8abee458e96945c0 not found: ID does not exist" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.434619 4836 scope.go:117] "RemoveContainer" containerID="281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71" Feb 17 
14:10:45 crc kubenswrapper[4836]: E0217 14:10:45.435153 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\": container with ID starting with 281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71 not found: ID does not exist" containerID="281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.435186 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71"} err="failed to get container status \"281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\": rpc error: code = NotFound desc = could not find container \"281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71\": container with ID starting with 281c6f51b95a96d557a29b449759275131f2fde30c4814e7692edfff96442e71 not found: ID does not exist" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.435225 4836 scope.go:117] "RemoveContainer" containerID="bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905" Feb 17 14:10:45 crc kubenswrapper[4836]: E0217 14:10:45.436028 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\": container with ID starting with bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905 not found: ID does not exist" containerID="bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905" Feb 17 14:10:45 crc kubenswrapper[4836]: I0217 14:10:45.436161 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905"} err="failed to get container status 
\"bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\": rpc error: code = NotFound desc = could not find container \"bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905\": container with ID starting with bf6a761df18c5e7699f2ec87d4b97f19da71f588cb12eff8e541140a361fa905 not found: ID does not exist" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.036103 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:10:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:10:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:10:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T14:10:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:77c09c30acdeaaf95ab463052841d32404d264d7b46bead6207afe51848d25e3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:b7b252dee7cfed79b278bcdec32ab88d70e98e83e6c0db9565a87d9e962cfecb\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1701350082},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.re
dhat.io/redhat/certified-operator-index@sha256:14398311b101163ddd1de78c093e161c5d3c9aac51a04e3d3d842fca6317ab0f\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5a091792b99bf4dfaec25f4c8e29da579e2f452d48b924c8323a18accb7f3290\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1234637517},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad77d0ead8abca8b884fad3be18215dbe8b4f8f098053551e4a899298cf5c918\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:b5338e2ca87e0b47fec93f55559f0ed6b39eef3ed3b7f085a4f0b205ccb86a5d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1213306565},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:28df36269fc553eb1adba5566d6dfc258a1a74063c4cfe8b5bdd3f202591cf56\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7fa59a55753e6c646b3b56a1a7080a5d70767fb964f1857c411fdf4e05ad4c71\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1201887930},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b329548
4d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1ed
a5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.036890 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.037067 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.037217 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.037370 4836 kubelet_node_status.go:585] "Error updating node status, will retry" err="error 
getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.037384 4836 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.189208 4836 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.189472 4836 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.189694 4836 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.189896 4836 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.190098 4836 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:47 crc kubenswrapper[4836]: I0217 14:10:47.190127 4836 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 
17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.190318 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="200ms" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.391901 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="400ms" Feb 17 14:10:47 crc kubenswrapper[4836]: E0217 14:10:47.793185 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="800ms" Feb 17 14:10:48 crc kubenswrapper[4836]: E0217 14:10:48.594485 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="1.6s" Feb 17 14:10:50 crc kubenswrapper[4836]: E0217 14:10:50.196453 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="3.2s" Feb 17 14:10:53 crc kubenswrapper[4836]: E0217 14:10:53.397863 4836 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.233:6443: connect: connection refused" interval="6.4s" Feb 17 14:10:53 crc 
kubenswrapper[4836]: I0217 14:10:53.567310 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:53 crc kubenswrapper[4836]: I0217 14:10:53.567981 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:53 crc kubenswrapper[4836]: I0217 14:10:53.568371 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:53 crc kubenswrapper[4836]: I0217 14:10:53.568710 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:53 crc kubenswrapper[4836]: I0217 14:10:53.582135 4836 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:10:53 crc kubenswrapper[4836]: I0217 14:10:53.582174 4836 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:10:53 crc kubenswrapper[4836]: E0217 14:10:53.582922 4836 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:53 crc kubenswrapper[4836]: I0217 14:10:53.583995 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:54 crc kubenswrapper[4836]: I0217 14:10:54.356921 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf91c6eb62e340b7bc0a6a5d6be5289b8047eab0dce6f276679ec8ec68eb5286"} Feb 17 14:10:54 crc kubenswrapper[4836]: I0217 14:10:54.574803 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:54 crc kubenswrapper[4836]: I0217 14:10:54.575251 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:54 crc kubenswrapper[4836]: I0217 14:10:54.575979 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:54 crc kubenswrapper[4836]: I0217 14:10:54.576534 4836 
status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:54 crc kubenswrapper[4836]: E0217 14:10:54.590697 4836 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.233:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18950e04c9e65df6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 14:10:42.170134006 +0000 UTC m=+268.513062275,LastTimestamp:2026-02-17 14:10:42.170134006 +0000 UTC m=+268.513062275,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.372336 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.372394 4836 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc" exitCode=1 Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.372461 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc"} Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.373339 4836 scope.go:117] "RemoveContainer" containerID="b6196237bf10370f0d43ac7ba96f1fbc861d8690734cf883081f67616e6047dc" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.373604 4836 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.374402 4836 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.374820 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.374974 4836 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" 
containerID="3077a3e0255093329a2e6d9fd21fb4fc0023c6b610b391d980a05fed4eddab3d" exitCode=0 Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.375041 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3077a3e0255093329a2e6d9fd21fb4fc0023c6b610b391d980a05fed4eddab3d"} Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.375137 4836 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.375160 4836 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:10:55 crc kubenswrapper[4836]: E0217 14:10:55.375411 4836 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.375545 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.376024 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 
14:10:55.376764 4836 status_manager.go:851] "Failed to get status for pod" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.377374 4836 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.377992 4836 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.378508 4836 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: I0217 14:10:55.378773 4836 status_manager.go:851] "Failed to get status for pod" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" pod="openshift-marketplace/redhat-operators-5rfnm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-5rfnm\": dial tcp 38.102.83.233:6443: connect: connection refused" Feb 17 14:10:55 crc kubenswrapper[4836]: 
I0217 14:10:55.922367 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:10:56 crc kubenswrapper[4836]: I0217 14:10:56.384555 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df5604c711d07a00c95b46282b2130f03ca9f80f7eeadd5328d6a53447b2cafd"} Feb 17 14:10:56 crc kubenswrapper[4836]: I0217 14:10:56.384607 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a62f76c3004a201a9960d20c895684207c229b1ef49ad96499ee78628828853b"} Feb 17 14:10:56 crc kubenswrapper[4836]: I0217 14:10:56.391322 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 14:10:56 crc kubenswrapper[4836]: I0217 14:10:56.391370 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f6e01917f3bbcd52cacc7f629e1bdcb4b5ebe5d9cbc8366c09b3f5f83d046b8f"} Feb 17 14:10:56 crc kubenswrapper[4836]: I0217 14:10:56.437892 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:10:56 crc kubenswrapper[4836]: I0217 14:10:56.446694 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:10:57 crc kubenswrapper[4836]: I0217 14:10:57.416509 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ece71941e8928ac74a98d60736174b728094d825a0933c160c294572fc2102f6"} Feb 17 14:10:57 crc kubenswrapper[4836]: I0217 14:10:57.417143 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"75dc129d71fbe6ef412d19324fa8cc3d4a50c1dbf2df6234354c4be811fcda50"} Feb 17 14:10:57 crc kubenswrapper[4836]: I0217 14:10:57.417198 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:10:58 crc kubenswrapper[4836]: I0217 14:10:58.425342 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"776243837c45b0b35c6d11141421a4357f14f398cc3c9d1ae1cfbd04cde7f3f2"} Feb 17 14:10:58 crc kubenswrapper[4836]: I0217 14:10:58.425897 4836 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:10:58 crc kubenswrapper[4836]: I0217 14:10:58.425921 4836 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:10:58 crc kubenswrapper[4836]: I0217 14:10:58.584520 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:58 crc kubenswrapper[4836]: I0217 14:10:58.584605 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:10:58 crc kubenswrapper[4836]: I0217 14:10:58.589582 4836 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[+]ping ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]log ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]etcd ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/generic-apiserver-start-informers ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/priority-and-fairness-filter ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-apiextensions-informers ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-apiextensions-controllers ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/crd-informer-synced ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-system-namespaces-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: 
[+]poststarthook/start-service-ip-repair-controllers ok Feb 17 14:10:58 crc kubenswrapper[4836]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 17 14:10:58 crc kubenswrapper[4836]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/bootstrap-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/start-kube-aggregator-informers ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/apiservice-registration-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/apiservice-discovery-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]autoregister-completion ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/apiservice-openapi-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 17 14:10:58 crc kubenswrapper[4836]: livez check failed Feb 17 14:10:58 crc kubenswrapper[4836]: I0217 14:10:58.589650 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 14:11:03 crc kubenswrapper[4836]: I0217 14:11:03.434026 4836 kubelet.go:1914] "Deleted mirror pod because it is outdated" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:11:03 crc kubenswrapper[4836]: I0217 14:11:03.463546 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:11:03 crc kubenswrapper[4836]: I0217 14:11:03.463559 4836 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:11:03 crc kubenswrapper[4836]: I0217 14:11:03.463605 4836 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:11:03 crc kubenswrapper[4836]: I0217 14:11:03.588644 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:11:03 crc kubenswrapper[4836]: I0217 14:11:03.591731 4836 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8349b29e-b284-4c0a-bda2-bda8a9c51c5d" Feb 17 14:11:04 crc kubenswrapper[4836]: I0217 14:11:04.469935 4836 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:11:04 crc kubenswrapper[4836]: I0217 14:11:04.469971 4836 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:11:04 crc kubenswrapper[4836]: I0217 14:11:04.473058 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 14:11:04 crc kubenswrapper[4836]: I0217 14:11:04.582506 4836 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="8349b29e-b284-4c0a-bda2-bda8a9c51c5d" Feb 17 14:11:05 crc kubenswrapper[4836]: I0217 14:11:05.473574 4836 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:11:05 crc kubenswrapper[4836]: I0217 14:11:05.473609 4836 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15e7e5f3-c249-4609-937e-ffdc78580880" Feb 17 14:11:05 crc kubenswrapper[4836]: I0217 14:11:05.503069 4836 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8349b29e-b284-4c0a-bda2-bda8a9c51c5d" Feb 17 14:11:12 crc kubenswrapper[4836]: I0217 14:11:12.353713 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 14:11:13 crc kubenswrapper[4836]: I0217 14:11:13.411797 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 14:11:13 crc kubenswrapper[4836]: I0217 14:11:13.446678 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 14:11:13 crc kubenswrapper[4836]: I0217 14:11:13.474651 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 14:11:13 crc kubenswrapper[4836]: I0217 14:11:13.511966 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 14:11:14 crc kubenswrapper[4836]: I0217 14:11:14.256647 4836 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 14:11:14 crc kubenswrapper[4836]: I0217 14:11:14.328796 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Feb 17 14:11:14 crc kubenswrapper[4836]: I0217 14:11:14.380900 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 14:11:14 crc kubenswrapper[4836]: I0217 14:11:14.830821 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 14:11:14 crc kubenswrapper[4836]: I0217 14:11:14.879427 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 14:11:14 crc kubenswrapper[4836]: I0217 14:11:14.879465 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.138732 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.157447 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.165800 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.184428 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.234654 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.340504 4836 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.598375 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.655430 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.811448 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 14:11:15 crc kubenswrapper[4836]: I0217 14:11:15.924911 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.054374 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.245940 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.294985 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.353813 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.399520 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.409549 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.474821 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.494930 4836 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.607364 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.628819 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.650521 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.688605 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.701926 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.840431 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 14:11:16 crc kubenswrapper[4836]: I0217 14:11:16.858054 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.167449 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.197155 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.249909 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 
14:11:17.273392 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.313616 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.325673 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.436352 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.451630 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.549770 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.568486 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.589123 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.640929 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.785071 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.790858 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.903682 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.950246 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 14:11:17 crc kubenswrapper[4836]: I0217 14:11:17.963460 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.025976 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.095995 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.157838 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.178952 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.184127 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.196551 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.241145 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 
14:11:18.278342 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.278412 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.506777 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.510708 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.527766 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.665353 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.685935 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.727243 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.727763 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.737503 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.775321 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.971905 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 14:11:18 crc kubenswrapper[4836]: I0217 14:11:18.972649 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.026969 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.067865 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.097483 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.159902 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.238121 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.243450 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.335406 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.348445 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.419077 4836 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.498345 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.499431 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.617753 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.621492 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.622256 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.660018 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.706048 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.838340 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.915205 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 14:11:19.920946 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 14:11:19 crc kubenswrapper[4836]: I0217 
14:11:19.986183 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.003253 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.095614 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.167801 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.224381 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.461369 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.536835 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.604055 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.621464 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.649015 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.824408 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.868037 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 14:11:20 crc kubenswrapper[4836]: I0217 14:11:20.948991 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.107887 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.131895 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.159364 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.179682 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.260142 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.426728 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.478724 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.525120 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.635846 
4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.637363 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.685080 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.738627 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.823480 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.830958 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.839558 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.893921 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.905943 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.908387 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.942016 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 
14:11:21 crc kubenswrapper[4836]: I0217 14:11:21.942374 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.044667 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.090619 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.134583 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.166787 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.166995 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.177380 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.216499 4836 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.236010 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.238332 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.288347 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.291658 4836 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.294350 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.317815 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.350733 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.405057 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.420011 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.490105 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.506547 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.561652 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.579279 4836 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.629055 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.655236 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.670998 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 14:11:22 crc kubenswrapper[4836]: I0217 14:11:22.919808 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.103971 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.117143 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.159875 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.162952 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.191065 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.219675 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.423145 4836 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.439934 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.445495 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.464466 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.490081 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.491546 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.575371 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.607940 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.609727 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.694549 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.702564 4836 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.728603 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.779317 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.821662 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.832427 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 14:11:23 crc kubenswrapper[4836]: I0217 14:11:23.854156 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.030603 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.060416 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.062791 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.188193 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.238854 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.246219 4836 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.277705 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.314000 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.450409 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.503415 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.544314 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.687101 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.711675 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.755676 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.781006 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.836394 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.903971 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 17 14:11:24 crc kubenswrapper[4836]: I0217 14:11:24.988016 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.016464 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.028432 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.053768 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.060152 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.061652 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.071332 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.075214 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.092333 4836 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.115665 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.119571 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.210846 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.237071 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.237099 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.409643 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.424701 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.458215 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.458230 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.473914 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.550000 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.550003 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.640353 4836 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.773642 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.803851 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 17 14:11:25 crc kubenswrapper[4836]: I0217 14:11:25.925150 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.062194 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.221026 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.258550 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.273751 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.325610 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.379113 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.433648 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.503149 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.541626 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.643218 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.662324 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.854930 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.897817 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 17 14:11:26 crc kubenswrapper[4836]: I0217 14:11:26.910715 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.131073 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.237367 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.350914 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.415907 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.477958 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.503025 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.552576 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.645150 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.706358 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.786280 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.817960 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 17 14:11:27 crc kubenswrapper[4836]: I0217 14:11:27.926653 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.146510 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.181935 4836 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.185250 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=47.185215921 podStartE2EDuration="47.185215921s" podCreationTimestamp="2026-02-17 14:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:11:02.943033577 +0000 UTC m=+289.285961866" watchObservedRunningTime="2026-02-17 14:11:28.185215921 +0000 UTC m=+314.528144190"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.186956 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5rfnm","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.187032 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rhsgl","openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 14:11:28 crc kubenswrapper[4836]: E0217 14:11:28.187283 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" containerName="installer"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.187319 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" containerName="installer"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.187454 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb43a5d-f735-4891-912a-3ba9e47a4055" containerName="installer"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.188008 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-khbdr","openshift-marketplace/redhat-marketplace-pxwhr","openshift-marketplace/community-operators-9w8zr","openshift-marketplace/redhat-operators-252vj","openshift-marketplace/certified-operators-vfmw4"]
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.188115 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.188483 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-252vj" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="registry-server" containerID="cri-o://6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa" gracePeriod=30
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.188595 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vfmw4" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="registry-server" containerID="cri-o://63859747d78e0b196aa1ae4f9aecdf579a3667fc25d5a072dbcb78b4447b6dc2" gracePeriod=30
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.188641 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" podUID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" containerName="marketplace-operator" containerID="cri-o://4b1cfa0180186477ad01885b0380528a2ed9a9e38e3b90ab0219a2e26e3de881" gracePeriod=30
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.188711 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9w8zr" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="registry-server" containerID="cri-o://56a4ac051fd52f2fd8e193686dffb745df251c7f892fec72d600a2fa80ecbd34" gracePeriod=30
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.188814 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pxwhr" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="registry-server" containerID="cri-o://c2eaa809f67d2bf376430950b6f31e802fbb9ae20ab0242708d65041bbaf3f07" gracePeriod=30
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.192092 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.197456 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.197514 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.197598 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnh7k\" (UniqueName: \"kubernetes.io/projected/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-kube-api-access-vnh7k\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.225262 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.225235428 podStartE2EDuration="25.225235428s" podCreationTimestamp="2026-02-17 14:11:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:11:28.221927865 +0000 UTC m=+314.564856134" watchObservedRunningTime="2026-02-17 14:11:28.225235428 +0000 UTC m=+314.568163707"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.235277 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.254816 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.300323 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.300383 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnh7k\" (UniqueName: \"kubernetes.io/projected/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-kube-api-access-vnh7k\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.300504 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.301885 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.309278 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.338170 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnh7k\" (UniqueName: \"kubernetes.io/projected/bd68f8c7-fdcc-449d-9f92-2f7afcb4917b-kube-api-access-vnh7k\") pod \"marketplace-operator-79b997595-rhsgl\" (UID: \"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b\") " pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.385242 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.411674 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.512952 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.545743 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.583845 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff" path="/var/lib/kubelet/pods/88cf2bb1-d70f-4b82-9b9a-9d7c7a4244ff/volumes"
Feb 17 14:11:28 crc kubenswrapper[4836]: E0217 14:11:28.607022 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa is running failed: container process not found" containerID="6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa" cmd=["grpc_health_probe","-addr=:50051"]
Feb 17 14:11:28 crc kubenswrapper[4836]: E0217 14:11:28.607451 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa is running failed: container process not found" containerID="6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa" cmd=["grpc_health_probe","-addr=:50051"]
Feb 17 14:11:28 crc kubenswrapper[4836]: E0217 14:11:28.607786 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa is running failed: container process not found" containerID="6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa" cmd=["grpc_health_probe","-addr=:50051"]
Feb 17 14:11:28 crc kubenswrapper[4836]: E0217 14:11:28.607842 4836 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-252vj" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="registry-server"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.615319 4836 generic.go:334] "Generic (PLEG): container finished" podID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerID="56a4ac051fd52f2fd8e193686dffb745df251c7f892fec72d600a2fa80ecbd34" exitCode=0
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.615396 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8zr" event={"ID":"089d1289-afe9-4ffe-9d96-ac10058335ed","Type":"ContainerDied","Data":"56a4ac051fd52f2fd8e193686dffb745df251c7f892fec72d600a2fa80ecbd34"}
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.615425 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9w8zr" event={"ID":"089d1289-afe9-4ffe-9d96-ac10058335ed","Type":"ContainerDied","Data":"aec4a035ba778cf216a49780b8ffa622c813a3d3daa4a826e68b03c1acc34c4d"}
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.615438 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aec4a035ba778cf216a49780b8ffa622c813a3d3daa4a826e68b03c1acc34c4d"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.619893 4836 generic.go:334] "Generic (PLEG): container finished" podID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" containerID="4b1cfa0180186477ad01885b0380528a2ed9a9e38e3b90ab0219a2e26e3de881" exitCode=0
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.619942 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" event={"ID":"985bc83c-52fa-45dc-ab4f-6e47ee47683e","Type":"ContainerDied","Data":"4b1cfa0180186477ad01885b0380528a2ed9a9e38e3b90ab0219a2e26e3de881"}
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.626105 4836 generic.go:334] "Generic (PLEG): container finished" podID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerID="c2eaa809f67d2bf376430950b6f31e802fbb9ae20ab0242708d65041bbaf3f07" exitCode=0
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.626277 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxwhr" event={"ID":"e9f23804-837d-4d3c-94b7-7cdefe6e94df","Type":"ContainerDied","Data":"c2eaa809f67d2bf376430950b6f31e802fbb9ae20ab0242708d65041bbaf3f07"}
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.628818 4836 generic.go:334] "Generic (PLEG): container finished" podID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerID="63859747d78e0b196aa1ae4f9aecdf579a3667fc25d5a072dbcb78b4447b6dc2" exitCode=0
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.628857 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfmw4" event={"ID":"8762f2f2-8375-4fdd-8a29-ea2ab598afa1","Type":"ContainerDied","Data":"63859747d78e0b196aa1ae4f9aecdf579a3667fc25d5a072dbcb78b4447b6dc2"}
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.639168 4836 generic.go:334] "Generic (PLEG): container finished" podID="a172042c-7dc6-4cea-906e-3d9135523f15" containerID="6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa" exitCode=0
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.639327 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-252vj" event={"ID":"a172042c-7dc6-4cea-906e-3d9135523f15","Type":"ContainerDied","Data":"6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa"}
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.640119 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.679829 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8zr"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.698006 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfmw4"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.711362 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.711890 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-catalog-content\") pod \"089d1289-afe9-4ffe-9d96-ac10058335ed\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.720681 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-252vj"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.730006 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxwhr"
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.786209 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rhsgl"]
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.803921 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "089d1289-afe9-4ffe-9d96-ac10058335ed" (UID: "089d1289-afe9-4ffe-9d96-ac10058335ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813093 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-utilities\") pod \"a172042c-7dc6-4cea-906e-3d9135523f15\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813144 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-catalog-content\") pod \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813174 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqqlz\" (UniqueName: \"kubernetes.io/projected/e9f23804-837d-4d3c-94b7-7cdefe6e94df-kube-api-access-hqqlz\") pod \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813198 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thgpf\" (UniqueName: \"kubernetes.io/projected/a172042c-7dc6-4cea-906e-3d9135523f15-kube-api-access-thgpf\") pod \"a172042c-7dc6-4cea-906e-3d9135523f15\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813220 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-utilities\") pod \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813251 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd5dt\" (UniqueName: \"kubernetes.io/projected/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-kube-api-access-xd5dt\") pod \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813285 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-operator-metrics\") pod \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813339 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-utilities\") pod \"089d1289-afe9-4ffe-9d96-ac10058335ed\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813379 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tknx6\" (UniqueName: \"kubernetes.io/projected/089d1289-afe9-4ffe-9d96-ac10058335ed-kube-api-access-tknx6\") pod \"089d1289-afe9-4ffe-9d96-ac10058335ed\" (UID: \"089d1289-afe9-4ffe-9d96-ac10058335ed\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813406 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-catalog-content\") pod \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\" (UID: \"8762f2f2-8375-4fdd-8a29-ea2ab598afa1\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813433 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-catalog-content\") pod \"a172042c-7dc6-4cea-906e-3d9135523f15\" (UID: \"a172042c-7dc6-4cea-906e-3d9135523f15\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813474 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-utilities\") pod \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\" (UID: \"e9f23804-837d-4d3c-94b7-7cdefe6e94df\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813498 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-trusted-ca\") pod \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813526 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfrbt\" (UniqueName: \"kubernetes.io/projected/985bc83c-52fa-45dc-ab4f-6e47ee47683e-kube-api-access-sfrbt\") pod \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\" (UID: \"985bc83c-52fa-45dc-ab4f-6e47ee47683e\") "
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.813694 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.815061 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-utilities" (OuterVolumeSpecName: "utilities") pod "089d1289-afe9-4ffe-9d96-ac10058335ed" (UID: "089d1289-afe9-4ffe-9d96-ac10058335ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.815631 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-utilities" (OuterVolumeSpecName: "utilities") pod "8762f2f2-8375-4fdd-8a29-ea2ab598afa1" (UID: "8762f2f2-8375-4fdd-8a29-ea2ab598afa1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.816635 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-utilities" (OuterVolumeSpecName: "utilities") pod "e9f23804-837d-4d3c-94b7-7cdefe6e94df" (UID: "e9f23804-837d-4d3c-94b7-7cdefe6e94df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.816729 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-utilities" (OuterVolumeSpecName: "utilities") pod "a172042c-7dc6-4cea-906e-3d9135523f15" (UID: "a172042c-7dc6-4cea-906e-3d9135523f15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.817460 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "985bc83c-52fa-45dc-ab4f-6e47ee47683e" (UID: "985bc83c-52fa-45dc-ab4f-6e47ee47683e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.821058 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "985bc83c-52fa-45dc-ab4f-6e47ee47683e" (UID: "985bc83c-52fa-45dc-ab4f-6e47ee47683e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.825642 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a172042c-7dc6-4cea-906e-3d9135523f15-kube-api-access-thgpf" (OuterVolumeSpecName: "kube-api-access-thgpf") pod "a172042c-7dc6-4cea-906e-3d9135523f15" (UID: "a172042c-7dc6-4cea-906e-3d9135523f15"). InnerVolumeSpecName "kube-api-access-thgpf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.825872 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-kube-api-access-xd5dt" (OuterVolumeSpecName: "kube-api-access-xd5dt") pod "8762f2f2-8375-4fdd-8a29-ea2ab598afa1" (UID: "8762f2f2-8375-4fdd-8a29-ea2ab598afa1"). InnerVolumeSpecName "kube-api-access-xd5dt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.825973 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f23804-837d-4d3c-94b7-7cdefe6e94df-kube-api-access-hqqlz" (OuterVolumeSpecName: "kube-api-access-hqqlz") pod "e9f23804-837d-4d3c-94b7-7cdefe6e94df" (UID: "e9f23804-837d-4d3c-94b7-7cdefe6e94df"). InnerVolumeSpecName "kube-api-access-hqqlz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.837073 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/985bc83c-52fa-45dc-ab4f-6e47ee47683e-kube-api-access-sfrbt" (OuterVolumeSpecName: "kube-api-access-sfrbt") pod "985bc83c-52fa-45dc-ab4f-6e47ee47683e" (UID: "985bc83c-52fa-45dc-ab4f-6e47ee47683e"). InnerVolumeSpecName "kube-api-access-sfrbt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.845696 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/089d1289-afe9-4ffe-9d96-ac10058335ed-kube-api-access-tknx6" (OuterVolumeSpecName: "kube-api-access-tknx6") pod "089d1289-afe9-4ffe-9d96-ac10058335ed" (UID: "089d1289-afe9-4ffe-9d96-ac10058335ed"). InnerVolumeSpecName "kube-api-access-tknx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.847441 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9f23804-837d-4d3c-94b7-7cdefe6e94df" (UID: "e9f23804-837d-4d3c-94b7-7cdefe6e94df"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.887932 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.897487 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8762f2f2-8375-4fdd-8a29-ea2ab598afa1" (UID: "8762f2f2-8375-4fdd-8a29-ea2ab598afa1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914422 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914457 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914467 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqqlz\" (UniqueName: \"kubernetes.io/projected/e9f23804-837d-4d3c-94b7-7cdefe6e94df-kube-api-access-hqqlz\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914479 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thgpf\" (UniqueName: \"kubernetes.io/projected/a172042c-7dc6-4cea-906e-3d9135523f15-kube-api-access-thgpf\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914489 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914498 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd5dt\" (UniqueName: \"kubernetes.io/projected/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-kube-api-access-xd5dt\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914509 4836 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914519 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/089d1289-afe9-4ffe-9d96-ac10058335ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914529 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tknx6\" (UniqueName: \"kubernetes.io/projected/089d1289-afe9-4ffe-9d96-ac10058335ed-kube-api-access-tknx6\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914538 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8762f2f2-8375-4fdd-8a29-ea2ab598afa1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914586 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9f23804-837d-4d3c-94b7-7cdefe6e94df-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914596 4836 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/985bc83c-52fa-45dc-ab4f-6e47ee47683e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.914605 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfrbt\" (UniqueName: \"kubernetes.io/projected/985bc83c-52fa-45dc-ab4f-6e47ee47683e-kube-api-access-sfrbt\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.960822 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a172042c-7dc6-4cea-906e-3d9135523f15" (UID: "a172042c-7dc6-4cea-906e-3d9135523f15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:11:28 crc kubenswrapper[4836]: I0217 14:11:28.980937 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.015249 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a172042c-7dc6-4cea-906e-3d9135523f15-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.114356 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.151554 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.378873 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.646209 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl" event={"ID":"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b","Type":"ContainerStarted","Data":"a6c42aa007fef7ea456d745f9c58e51c9dca34bd60748608d671a37224d1bf1e"} Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.646275 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl" event={"ID":"bd68f8c7-fdcc-449d-9f92-2f7afcb4917b","Type":"ContainerStarted","Data":"8bea1300461bd8244149c813fa4e26a87b305c74e5cde9750a56a339a4aa01e3"} Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.646574 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.648040 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" event={"ID":"985bc83c-52fa-45dc-ab4f-6e47ee47683e","Type":"ContainerDied","Data":"7d0ca8f5e10670b96b45ab236df0ffcb5b0c0577a99d998beb3a30327978aa5e"} Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.648060 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-khbdr" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.648108 4836 scope.go:117] "RemoveContainer" containerID="4b1cfa0180186477ad01885b0380528a2ed9a9e38e3b90ab0219a2e26e3de881" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.650288 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pxwhr" event={"ID":"e9f23804-837d-4d3c-94b7-7cdefe6e94df","Type":"ContainerDied","Data":"aef23167292c1dafb12389117081e87d3bd5bee8abc67ecc65bf3cd0a4bf9f1c"} Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.650417 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pxwhr" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.652978 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.655371 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vfmw4" event={"ID":"8762f2f2-8375-4fdd-8a29-ea2ab598afa1","Type":"ContainerDied","Data":"a5a3c8cb6babc233f1c2ac1e8dd3635788628a1cb1bb705f8a779b47d0562e2b"} Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.655606 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vfmw4" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.662994 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9w8zr" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.663671 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-252vj" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.663751 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-252vj" event={"ID":"a172042c-7dc6-4cea-906e-3d9135523f15","Type":"ContainerDied","Data":"8ef482fc8eb2712be43ba1d606607d7a887e18d38349afed73ed063a65b62543"} Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.668984 4836 scope.go:117] "RemoveContainer" containerID="c2eaa809f67d2bf376430950b6f31e802fbb9ae20ab0242708d65041bbaf3f07" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.671279 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rhsgl" podStartSLOduration=4.671255436 podStartE2EDuration="4.671255436s" podCreationTimestamp="2026-02-17 14:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:11:29.668230691 +0000 UTC m=+316.011158980" watchObservedRunningTime="2026-02-17 14:11:29.671255436 +0000 UTC m=+316.014183705" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.686724 4836 scope.go:117] "RemoveContainer" containerID="03ed8f2f65fff33093a6776fd604dcab5d3520ae863a96ba61bdb418d4e8293c" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.749746 4836 scope.go:117] "RemoveContainer" containerID="73060b123dbcdb54cacfb96235e77305156ac3a055b89a97013a4725f13fbc92" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.751986 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-252vj"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.768370 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-252vj"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.770433 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-khbdr"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.773842 4836 scope.go:117] "RemoveContainer" containerID="63859747d78e0b196aa1ae4f9aecdf579a3667fc25d5a072dbcb78b4447b6dc2" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.776934 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-khbdr"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.780342 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxwhr"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.784033 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pxwhr"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.794503 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vfmw4"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.798211 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vfmw4"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.807052 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9w8zr"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.808887 4836 scope.go:117] "RemoveContainer" containerID="c45d87fc95c2bb97baee74cdf9eb8890199ccbcb1361ab9d40701a3bf1b0aef6" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.815918 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9w8zr"] Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.831109 4836 scope.go:117] "RemoveContainer" containerID="fdc430f3f9d22a422de0b99423af704e6cc0b0c2a36fc9623c6db36600886e79" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.847933 4836 scope.go:117] "RemoveContainer" 
containerID="6b92cf31cabf6eec3e2e948b5b114dee8dd77bf37e267af921b53ecc81d156aa" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.869472 4836 scope.go:117] "RemoveContainer" containerID="5fe13927481d2948ed6f845b9678013bbf8fcbf061f7116c0ec82c5abd9ee696" Feb 17 14:11:29 crc kubenswrapper[4836]: I0217 14:11:29.888102 4836 scope.go:117] "RemoveContainer" containerID="fbdef3e9d702e26b2d9eab100a7cb39741759b5bc646072d63aa2cde6951ee43" Feb 17 14:11:30 crc kubenswrapper[4836]: I0217 14:11:30.576828 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" path="/var/lib/kubelet/pods/089d1289-afe9-4ffe-9d96-ac10058335ed/volumes" Feb 17 14:11:30 crc kubenswrapper[4836]: I0217 14:11:30.577639 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" path="/var/lib/kubelet/pods/8762f2f2-8375-4fdd-8a29-ea2ab598afa1/volumes" Feb 17 14:11:30 crc kubenswrapper[4836]: I0217 14:11:30.578423 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" path="/var/lib/kubelet/pods/985bc83c-52fa-45dc-ab4f-6e47ee47683e/volumes" Feb 17 14:11:30 crc kubenswrapper[4836]: I0217 14:11:30.579556 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" path="/var/lib/kubelet/pods/a172042c-7dc6-4cea-906e-3d9135523f15/volumes" Feb 17 14:11:30 crc kubenswrapper[4836]: I0217 14:11:30.580253 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" path="/var/lib/kubelet/pods/e9f23804-837d-4d3c-94b7-7cdefe6e94df/volumes" Feb 17 14:11:31 crc kubenswrapper[4836]: I0217 14:11:31.244875 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 14:11:31 crc kubenswrapper[4836]: I0217 14:11:31.626437 4836 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 14:11:36 crc kubenswrapper[4836]: I0217 14:11:36.975636 4836 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:11:36 crc kubenswrapper[4836]: I0217 14:11:36.976706 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e" gracePeriod=5 Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.550430 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.551329 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.576287 4836 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.587484 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.587528 4836 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9043d749-7e9c-488b-b0a8-bee71a618a8c" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.590774 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.590844 4836 kubelet.go:2673] "Unable to find 
pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="9043d749-7e9c-488b-b0a8-bee71a618a8c" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.705966 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706052 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706156 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706163 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706193 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706287 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706324 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706408 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706516 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.706985 4836 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.707000 4836 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.707010 4836 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.707019 4836 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.717396 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.744608 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.744667 4836 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e" exitCode=137 Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.744716 4836 scope.go:117] "RemoveContainer" containerID="8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.744754 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.765121 4836 scope.go:117] "RemoveContainer" containerID="8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e" Feb 17 14:11:42 crc kubenswrapper[4836]: E0217 14:11:42.765760 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e\": container with ID starting with 8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e not found: ID does not exist" containerID="8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.765817 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e"} err="failed to get container status \"8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e\": rpc error: code = NotFound desc = could not find container 
\"8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e\": container with ID starting with 8757baf5bc7be098bf127cb1aafc088ae6d42ff3064423215125a6bb9cdaa67e not found: ID does not exist" Feb 17 14:11:42 crc kubenswrapper[4836]: I0217 14:11:42.808199 4836 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 14:11:44 crc kubenswrapper[4836]: I0217 14:11:44.575833 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 14:12:17 crc kubenswrapper[4836]: I0217 14:12:17.739197 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5l6x4"] Feb 17 14:12:17 crc kubenswrapper[4836]: I0217 14:12:17.739919 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" podUID="8c77bcf1-4025-4c35-9580-41e9a61195e8" containerName="controller-manager" containerID="cri-o://48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c" gracePeriod=30 Feb 17 14:12:17 crc kubenswrapper[4836]: I0217 14:12:17.818629 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"] Feb 17 14:12:17 crc kubenswrapper[4836]: I0217 14:12:17.818940 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" podUID="5ad14aa6-962d-4f8f-babe-745f65d63560" containerName="route-controller-manager" containerID="cri-o://7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e" gracePeriod=30 Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.103103 4836 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.156823 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202398 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-client-ca\") pod \"8c77bcf1-4025-4c35-9580-41e9a61195e8\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202499 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad14aa6-962d-4f8f-babe-745f65d63560-serving-cert\") pod \"5ad14aa6-962d-4f8f-babe-745f65d63560\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202597 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert\") pod \"8c77bcf1-4025-4c35-9580-41e9a61195e8\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202638 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj6ml\" (UniqueName: \"kubernetes.io/projected/8c77bcf1-4025-4c35-9580-41e9a61195e8-kube-api-access-nj6ml\") pod \"8c77bcf1-4025-4c35-9580-41e9a61195e8\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202685 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-config\") pod 
\"5ad14aa6-962d-4f8f-babe-745f65d63560\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202720 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-client-ca\") pod \"5ad14aa6-962d-4f8f-babe-745f65d63560\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202747 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config\") pod \"8c77bcf1-4025-4c35-9580-41e9a61195e8\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202809 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles\") pod \"8c77bcf1-4025-4c35-9580-41e9a61195e8\" (UID: \"8c77bcf1-4025-4c35-9580-41e9a61195e8\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.202883 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhngg\" (UniqueName: \"kubernetes.io/projected/5ad14aa6-962d-4f8f-babe-745f65d63560-kube-api-access-fhngg\") pod \"5ad14aa6-962d-4f8f-babe-745f65d63560\" (UID: \"5ad14aa6-962d-4f8f-babe-745f65d63560\") " Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.204502 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-config" (OuterVolumeSpecName: "config") pod "5ad14aa6-962d-4f8f-babe-745f65d63560" (UID: "5ad14aa6-962d-4f8f-babe-745f65d63560"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.205197 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8c77bcf1-4025-4c35-9580-41e9a61195e8" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.204789 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config" (OuterVolumeSpecName: "config") pod "8c77bcf1-4025-4c35-9580-41e9a61195e8" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.205314 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-client-ca" (OuterVolumeSpecName: "client-ca") pod "5ad14aa6-962d-4f8f-babe-745f65d63560" (UID: "5ad14aa6-962d-4f8f-babe-745f65d63560"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.205492 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-client-ca" (OuterVolumeSpecName: "client-ca") pod "8c77bcf1-4025-4c35-9580-41e9a61195e8" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.211937 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ad14aa6-962d-4f8f-babe-745f65d63560-kube-api-access-fhngg" (OuterVolumeSpecName: "kube-api-access-fhngg") pod "5ad14aa6-962d-4f8f-babe-745f65d63560" (UID: "5ad14aa6-962d-4f8f-babe-745f65d63560"). InnerVolumeSpecName "kube-api-access-fhngg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.211942 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ad14aa6-962d-4f8f-babe-745f65d63560-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5ad14aa6-962d-4f8f-babe-745f65d63560" (UID: "5ad14aa6-962d-4f8f-babe-745f65d63560"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.211950 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8c77bcf1-4025-4c35-9580-41e9a61195e8" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.212241 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c77bcf1-4025-4c35-9580-41e9a61195e8-kube-api-access-nj6ml" (OuterVolumeSpecName: "kube-api-access-nj6ml") pod "8c77bcf1-4025-4c35-9580-41e9a61195e8" (UID: "8c77bcf1-4025-4c35-9580-41e9a61195e8"). InnerVolumeSpecName "kube-api-access-nj6ml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304396 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304455 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ad14aa6-962d-4f8f-babe-745f65d63560-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304473 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c77bcf1-4025-4c35-9580-41e9a61195e8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304494 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj6ml\" (UniqueName: \"kubernetes.io/projected/8c77bcf1-4025-4c35-9580-41e9a61195e8-kube-api-access-nj6ml\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304514 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304530 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ad14aa6-962d-4f8f-babe-745f65d63560-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304546 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304563 4836 reconciler_common.go:293] "Volume detached for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8c77bcf1-4025-4c35-9580-41e9a61195e8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.304580 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhngg\" (UniqueName: \"kubernetes.io/projected/5ad14aa6-962d-4f8f-babe-745f65d63560-kube-api-access-fhngg\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.384702 4836 generic.go:334] "Generic (PLEG): container finished" podID="5ad14aa6-962d-4f8f-babe-745f65d63560" containerID="7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e" exitCode=0 Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.384754 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" event={"ID":"5ad14aa6-962d-4f8f-babe-745f65d63560","Type":"ContainerDied","Data":"7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e"} Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.384788 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.384826 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4" event={"ID":"5ad14aa6-962d-4f8f-babe-745f65d63560","Type":"ContainerDied","Data":"5a18cb47469c9084e91c362d0628474be4ea76582e846e1e93705e36c466141f"} Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.384851 4836 scope.go:117] "RemoveContainer" containerID="7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.387550 4836 generic.go:334] "Generic (PLEG): container finished" podID="8c77bcf1-4025-4c35-9580-41e9a61195e8" containerID="48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c" exitCode=0 Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.387599 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" event={"ID":"8c77bcf1-4025-4c35-9580-41e9a61195e8","Type":"ContainerDied","Data":"48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c"} Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.387619 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" event={"ID":"8c77bcf1-4025-4c35-9580-41e9a61195e8","Type":"ContainerDied","Data":"b99d73db17eb9c6b2aa85ca03f0903902f643a2f2fbc708d9b4c51f4e9d1ede7"} Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.387625 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-5l6x4" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.413016 4836 scope.go:117] "RemoveContainer" containerID="7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e" Feb 17 14:12:18 crc kubenswrapper[4836]: E0217 14:12:18.414051 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e\": container with ID starting with 7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e not found: ID does not exist" containerID="7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.414084 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e"} err="failed to get container status \"7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e\": rpc error: code = NotFound desc = could not find container \"7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e\": container with ID starting with 7bf8b1ad3766a39524e948476d227ffd5480f4d617fc83f3457ebccacb5d8b7e not found: ID does not exist" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.414136 4836 scope.go:117] "RemoveContainer" containerID="48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.441910 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5l6x4"] Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.448823 4836 scope.go:117] "RemoveContainer" containerID="48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c" Feb 17 14:12:18 crc kubenswrapper[4836]: E0217 14:12:18.449605 4836 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c\": container with ID starting with 48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c not found: ID does not exist" containerID="48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.449732 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c"} err="failed to get container status \"48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c\": rpc error: code = NotFound desc = could not find container \"48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c\": container with ID starting with 48a4ab3353f6c1184058baaeccdfdd55224e71bf388392ae1f7582ff6b6cbd0c not found: ID does not exist" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.451037 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-5l6x4"] Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.455392 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"] Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.459598 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2mmw4"] Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.577061 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad14aa6-962d-4f8f-babe-745f65d63560" path="/var/lib/kubelet/pods/5ad14aa6-962d-4f8f-babe-745f65d63560/volumes" Feb 17 14:12:18 crc kubenswrapper[4836]: I0217 14:12:18.578012 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c77bcf1-4025-4c35-9580-41e9a61195e8" 
path="/var/lib/kubelet/pods/8c77bcf1-4025-4c35-9580-41e9a61195e8/volumes" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.162508 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"] Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163251 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163266 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163274 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163281 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163312 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163320 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163329 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c77bcf1-4025-4c35-9580-41e9a61195e8" containerName="controller-manager" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163341 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c77bcf1-4025-4c35-9580-41e9a61195e8" containerName="controller-manager" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163354 4836 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5ad14aa6-962d-4f8f-babe-745f65d63560" containerName="route-controller-manager" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163359 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad14aa6-962d-4f8f-babe-745f65d63560" containerName="route-controller-manager" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163366 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" containerName="marketplace-operator" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163372 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" containerName="marketplace-operator" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163383 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163389 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163401 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163407 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163414 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163420 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163428 4836 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163434 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163444 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163450 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163458 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163464 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163471 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163477 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163486 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163492 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="extract-content" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163499 4836 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163505 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.163513 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163519 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="extract-utilities" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163615 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="a172042c-7dc6-4cea-906e-3d9135523f15" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163629 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="8762f2f2-8375-4fdd-8a29-ea2ab598afa1" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163636 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="985bc83c-52fa-45dc-ab4f-6e47ee47683e" containerName="marketplace-operator" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163643 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c77bcf1-4025-4c35-9580-41e9a61195e8" containerName="controller-manager" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163650 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163659 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f23804-837d-4d3c-94b7-7cdefe6e94df" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163668 4836 
memory_manager.go:354] "RemoveStaleState removing state" podUID="089d1289-afe9-4ffe-9d96-ac10058335ed" containerName="registry-server" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.163674 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ad14aa6-962d-4f8f-babe-745f65d63560" containerName="route-controller-manager" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.164164 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.166644 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.166652 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.167458 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.167523 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.167522 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.167775 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.176337 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.182585 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"] Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.198957 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"] Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.199897 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.201730 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.202077 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.202899 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.203030 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.203165 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.203447 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.224406 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"] Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.286000 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"] Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.286691 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-wttn9 proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4" podUID="e4797214-796b-4e39-ae05-c719bbffd7bf" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.315341 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"] Feb 17 14:12:19 crc kubenswrapper[4836]: E0217 14:12:19.315978 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-h9jbv serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb" podUID="34e63fa4-25c0-40bc-85bf-9428bc0842b0" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.345929 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-client-ca\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.345991 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e63fa4-25c0-40bc-85bf-9428bc0842b0-serving-cert\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb" Feb 17 14:12:19 crc 
kubenswrapper[4836]: I0217 14:12:19.346164 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-config\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.346250 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-client-ca\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.346338 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-proxy-ca-bundles\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.346383 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-config\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4" Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.346442 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4797214-796b-4e39-ae05-c719bbffd7bf-serving-cert\") pod 
\"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.346470 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wttn9\" (UniqueName: \"kubernetes.io/projected/e4797214-796b-4e39-ae05-c719bbffd7bf-kube-api-access-wttn9\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.346647 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9jbv\" (UniqueName: \"kubernetes.io/projected/34e63fa4-25c0-40bc-85bf-9428bc0842b0-kube-api-access-h9jbv\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.397018 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.397051 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.407982 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.414222 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448429 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9jbv\" (UniqueName: \"kubernetes.io/projected/34e63fa4-25c0-40bc-85bf-9428bc0842b0-kube-api-access-h9jbv\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448510 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-client-ca\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448539 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e63fa4-25c0-40bc-85bf-9428bc0842b0-serving-cert\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448568 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-config\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448595 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-client-ca\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448625 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-proxy-ca-bundles\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448670 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-config\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448710 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4797214-796b-4e39-ae05-c719bbffd7bf-serving-cert\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.448743 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wttn9\" (UniqueName: \"kubernetes.io/projected/e4797214-796b-4e39-ae05-c719bbffd7bf-kube-api-access-wttn9\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.449658 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-client-ca\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.450066 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-client-ca\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.450495 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-proxy-ca-bundles\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.450696 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-config\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.450800 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-config\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.455766 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4797214-796b-4e39-ae05-c719bbffd7bf-serving-cert\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.457841 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e63fa4-25c0-40bc-85bf-9428bc0842b0-serving-cert\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.466360 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9jbv\" (UniqueName: \"kubernetes.io/projected/34e63fa4-25c0-40bc-85bf-9428bc0842b0-kube-api-access-h9jbv\") pod \"route-controller-manager-6bb9976d8b-krcmb\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.474072 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wttn9\" (UniqueName: \"kubernetes.io/projected/e4797214-796b-4e39-ae05-c719bbffd7bf-kube-api-access-wttn9\") pod \"controller-manager-69d6c474bd-9hpt4\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650536 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-config\") pod \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650630 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-client-ca\") pod \"e4797214-796b-4e39-ae05-c719bbffd7bf\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650716 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-proxy-ca-bundles\") pod \"e4797214-796b-4e39-ae05-c719bbffd7bf\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650790 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e63fa4-25c0-40bc-85bf-9428bc0842b0-serving-cert\") pod \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650811 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9jbv\" (UniqueName: \"kubernetes.io/projected/34e63fa4-25c0-40bc-85bf-9428bc0842b0-kube-api-access-h9jbv\") pod \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650859 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-config\") pod \"e4797214-796b-4e39-ae05-c719bbffd7bf\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650887 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-client-ca\") pod \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\" (UID: \"34e63fa4-25c0-40bc-85bf-9428bc0842b0\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650925 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wttn9\" (UniqueName: \"kubernetes.io/projected/e4797214-796b-4e39-ae05-c719bbffd7bf-kube-api-access-wttn9\") pod \"e4797214-796b-4e39-ae05-c719bbffd7bf\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.650946 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4797214-796b-4e39-ae05-c719bbffd7bf-serving-cert\") pod \"e4797214-796b-4e39-ae05-c719bbffd7bf\" (UID: \"e4797214-796b-4e39-ae05-c719bbffd7bf\") "
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.651905 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4797214-796b-4e39-ae05-c719bbffd7bf" (UID: "e4797214-796b-4e39-ae05-c719bbffd7bf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.652065 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-config" (OuterVolumeSpecName: "config") pod "34e63fa4-25c0-40bc-85bf-9428bc0842b0" (UID: "34e63fa4-25c0-40bc-85bf-9428bc0842b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.652657 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-client-ca" (OuterVolumeSpecName: "client-ca") pod "34e63fa4-25c0-40bc-85bf-9428bc0842b0" (UID: "34e63fa4-25c0-40bc-85bf-9428bc0842b0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.652940 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-config" (OuterVolumeSpecName: "config") pod "e4797214-796b-4e39-ae05-c719bbffd7bf" (UID: "e4797214-796b-4e39-ae05-c719bbffd7bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.653013 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e4797214-796b-4e39-ae05-c719bbffd7bf" (UID: "e4797214-796b-4e39-ae05-c719bbffd7bf"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.655592 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e63fa4-25c0-40bc-85bf-9428bc0842b0-kube-api-access-h9jbv" (OuterVolumeSpecName: "kube-api-access-h9jbv") pod "34e63fa4-25c0-40bc-85bf-9428bc0842b0" (UID: "34e63fa4-25c0-40bc-85bf-9428bc0842b0"). InnerVolumeSpecName "kube-api-access-h9jbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.656886 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e63fa4-25c0-40bc-85bf-9428bc0842b0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "34e63fa4-25c0-40bc-85bf-9428bc0842b0" (UID: "34e63fa4-25c0-40bc-85bf-9428bc0842b0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.657017 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4797214-796b-4e39-ae05-c719bbffd7bf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4797214-796b-4e39-ae05-c719bbffd7bf" (UID: "e4797214-796b-4e39-ae05-c719bbffd7bf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.659391 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4797214-796b-4e39-ae05-c719bbffd7bf-kube-api-access-wttn9" (OuterVolumeSpecName: "kube-api-access-wttn9") pod "e4797214-796b-4e39-ae05-c719bbffd7bf" (UID: "e4797214-796b-4e39-ae05-c719bbffd7bf"). InnerVolumeSpecName "kube-api-access-wttn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752595 4836 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752657 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34e63fa4-25c0-40bc-85bf-9428bc0842b0-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752672 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9jbv\" (UniqueName: \"kubernetes.io/projected/34e63fa4-25c0-40bc-85bf-9428bc0842b0-kube-api-access-h9jbv\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752689 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752706 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752718 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4797214-796b-4e39-ae05-c719bbffd7bf-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752731 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wttn9\" (UniqueName: \"kubernetes.io/projected/e4797214-796b-4e39-ae05-c719bbffd7bf-kube-api-access-wttn9\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752744 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34e63fa4-25c0-40bc-85bf-9428bc0842b0-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:19 crc kubenswrapper[4836]: I0217 14:12:19.752757 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4797214-796b-4e39-ae05-c719bbffd7bf-client-ca\") on node \"crc\" DevicePath \"\""
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.403670 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.403761 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.454477 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"]
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.455200 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.459228 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.459483 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.461108 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.461381 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.461551 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.468581 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.474019 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"]
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.480439 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-krcmb"]
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.483711 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"]
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.491032 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"]
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.493592 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69d6c474bd-9hpt4"]
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.564416 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-client-ca\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.564465 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrs9m\" (UniqueName: \"kubernetes.io/projected/a41b80c5-58ef-4d96-a176-02d0618297ee-kube-api-access-lrs9m\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.564560 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-config\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.564599 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41b80c5-58ef-4d96-a176-02d0618297ee-serving-cert\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.575860 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e63fa4-25c0-40bc-85bf-9428bc0842b0" path="/var/lib/kubelet/pods/34e63fa4-25c0-40bc-85bf-9428bc0842b0/volumes"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.576330 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4797214-796b-4e39-ae05-c719bbffd7bf" path="/var/lib/kubelet/pods/e4797214-796b-4e39-ae05-c719bbffd7bf/volumes"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.666171 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-config\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.666231 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41b80c5-58ef-4d96-a176-02d0618297ee-serving-cert\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.666255 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-client-ca\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.666274 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrs9m\" (UniqueName: \"kubernetes.io/projected/a41b80c5-58ef-4d96-a176-02d0618297ee-kube-api-access-lrs9m\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.667520 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-config\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.667712 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-client-ca\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.670872 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41b80c5-58ef-4d96-a176-02d0618297ee-serving-cert\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.685406 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrs9m\" (UniqueName: \"kubernetes.io/projected/a41b80c5-58ef-4d96-a176-02d0618297ee-kube-api-access-lrs9m\") pod \"route-controller-manager-655d9f6b-9w5qq\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:20 crc kubenswrapper[4836]: I0217 14:12:20.777770 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:21 crc kubenswrapper[4836]: I0217 14:12:21.016633 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"]
Feb 17 14:12:21 crc kubenswrapper[4836]: I0217 14:12:21.411785 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" event={"ID":"a41b80c5-58ef-4d96-a176-02d0618297ee","Type":"ContainerStarted","Data":"ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb"}
Feb 17 14:12:21 crc kubenswrapper[4836]: I0217 14:12:21.412186 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" event={"ID":"a41b80c5-58ef-4d96-a176-02d0618297ee","Type":"ContainerStarted","Data":"4ca4605b7bfe00847e11510ae10a3a3ad08cfbb71d140733665acea3abcefee6"}
Feb 17 14:12:21 crc kubenswrapper[4836]: I0217 14:12:21.413985 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:21 crc kubenswrapper[4836]: I0217 14:12:21.440518 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" podStartSLOduration=2.440492006 podStartE2EDuration="2.440492006s" podCreationTimestamp="2026-02-17 14:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:12:21.439230827 +0000 UTC m=+367.782159106" watchObservedRunningTime="2026-02-17 14:12:21.440492006 +0000 UTC m=+367.783420285"
Feb 17 14:12:21 crc kubenswrapper[4836]: I0217 14:12:21.564275 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.150227 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"]
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.151359 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.156052 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.156052 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.157511 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.157831 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.158487 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.158714 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.161746 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.177783 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"]
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.303407 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-client-ca\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.303468 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjgk\" (UniqueName: \"kubernetes.io/projected/3135ca20-3162-4278-bbd7-de1d6f977dfe-kube-api-access-hdjgk\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.303516 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-proxy-ca-bundles\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.303725 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3135ca20-3162-4278-bbd7-de1d6f977dfe-serving-cert\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.303861 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-config\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.405496 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-proxy-ca-bundles\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.405976 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3135ca20-3162-4278-bbd7-de1d6f977dfe-serving-cert\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.406119 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-config\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.406335 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-client-ca\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.406450 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjgk\" (UniqueName: \"kubernetes.io/projected/3135ca20-3162-4278-bbd7-de1d6f977dfe-kube-api-access-hdjgk\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.407984 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-proxy-ca-bundles\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.408098 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-client-ca\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.408244 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-config\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.414078 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3135ca20-3162-4278-bbd7-de1d6f977dfe-serving-cert\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.430470 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjgk\" (UniqueName: \"kubernetes.io/projected/3135ca20-3162-4278-bbd7-de1d6f977dfe-kube-api-access-hdjgk\") pod \"controller-manager-85f58dddc7-nc7rr\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") " pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.477960 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:23 crc kubenswrapper[4836]: I0217 14:12:23.689494 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"]
Feb 17 14:12:24 crc kubenswrapper[4836]: I0217 14:12:24.434015 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr" event={"ID":"3135ca20-3162-4278-bbd7-de1d6f977dfe","Type":"ContainerStarted","Data":"7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a"}
Feb 17 14:12:24 crc kubenswrapper[4836]: I0217 14:12:24.434551 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr" event={"ID":"3135ca20-3162-4278-bbd7-de1d6f977dfe","Type":"ContainerStarted","Data":"7e2281e26a53d8122732046bbdbf7cffb006b03fc9fd9923b35c064e85e5470c"}
Feb 17 14:12:24 crc kubenswrapper[4836]: I0217 14:12:24.434576 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:24 crc kubenswrapper[4836]: I0217 14:12:24.444252 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:24 crc kubenswrapper[4836]: I0217 14:12:24.455388 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr" podStartSLOduration=5.455362852 podStartE2EDuration="5.455362852s" podCreationTimestamp="2026-02-17 14:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:12:24.455275319 +0000 UTC m=+370.798203598" watchObservedRunningTime="2026-02-17 14:12:24.455362852 +0000 UTC m=+370.798291121"
Feb 17 14:12:29 crc kubenswrapper[4836]: I0217 14:12:29.765275 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:12:29 crc kubenswrapper[4836]: I0217 14:12:29.765387 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:12:37 crc kubenswrapper[4836]: I0217 14:12:37.717742 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"]
Feb 17 14:12:37 crc kubenswrapper[4836]: I0217 14:12:37.718563 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" podUID="a41b80c5-58ef-4d96-a176-02d0618297ee" containerName="route-controller-manager" containerID="cri-o://ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb" gracePeriod=30
Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.100424 4836 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.202273 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41b80c5-58ef-4d96-a176-02d0618297ee-serving-cert\") pod \"a41b80c5-58ef-4d96-a176-02d0618297ee\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.202435 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-config\") pod \"a41b80c5-58ef-4d96-a176-02d0618297ee\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.202474 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-client-ca\") pod \"a41b80c5-58ef-4d96-a176-02d0618297ee\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.202663 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrs9m\" (UniqueName: \"kubernetes.io/projected/a41b80c5-58ef-4d96-a176-02d0618297ee-kube-api-access-lrs9m\") pod \"a41b80c5-58ef-4d96-a176-02d0618297ee\" (UID: \"a41b80c5-58ef-4d96-a176-02d0618297ee\") " Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.204211 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-client-ca" (OuterVolumeSpecName: "client-ca") pod "a41b80c5-58ef-4d96-a176-02d0618297ee" (UID: "a41b80c5-58ef-4d96-a176-02d0618297ee"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.204199 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-config" (OuterVolumeSpecName: "config") pod "a41b80c5-58ef-4d96-a176-02d0618297ee" (UID: "a41b80c5-58ef-4d96-a176-02d0618297ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.208662 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41b80c5-58ef-4d96-a176-02d0618297ee-kube-api-access-lrs9m" (OuterVolumeSpecName: "kube-api-access-lrs9m") pod "a41b80c5-58ef-4d96-a176-02d0618297ee" (UID: "a41b80c5-58ef-4d96-a176-02d0618297ee"). InnerVolumeSpecName "kube-api-access-lrs9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.208918 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a41b80c5-58ef-4d96-a176-02d0618297ee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a41b80c5-58ef-4d96-a176-02d0618297ee" (UID: "a41b80c5-58ef-4d96-a176-02d0618297ee"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.304464 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrs9m\" (UniqueName: \"kubernetes.io/projected/a41b80c5-58ef-4d96-a176-02d0618297ee-kube-api-access-lrs9m\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.304514 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a41b80c5-58ef-4d96-a176-02d0618297ee-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.304528 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.304540 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a41b80c5-58ef-4d96-a176-02d0618297ee-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.526471 4836 generic.go:334] "Generic (PLEG): container finished" podID="a41b80c5-58ef-4d96-a176-02d0618297ee" containerID="ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb" exitCode=0 Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.526527 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" event={"ID":"a41b80c5-58ef-4d96-a176-02d0618297ee","Type":"ContainerDied","Data":"ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb"} Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.526551 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.526568 4836 scope.go:117] "RemoveContainer" containerID="ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.526558 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq" event={"ID":"a41b80c5-58ef-4d96-a176-02d0618297ee","Type":"ContainerDied","Data":"4ca4605b7bfe00847e11510ae10a3a3ad08cfbb71d140733665acea3abcefee6"} Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.548470 4836 scope.go:117] "RemoveContainer" containerID="ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb" Feb 17 14:12:38 crc kubenswrapper[4836]: E0217 14:12:38.549175 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb\": container with ID starting with ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb not found: ID does not exist" containerID="ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.549213 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb"} err="failed to get container status \"ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb\": rpc error: code = NotFound desc = could not find container \"ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb\": container with ID starting with ea6b6d4a9dc8e039294cc0c6dd70b4120029dc0a72d0e5936067797486705deb not found: ID does not exist" Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.581602 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"] Feb 17 14:12:38 crc kubenswrapper[4836]: I0217 14:12:38.585576 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-655d9f6b-9w5qq"] Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.157821 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx"] Feb 17 14:12:39 crc kubenswrapper[4836]: E0217 14:12:39.158339 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41b80c5-58ef-4d96-a176-02d0618297ee" containerName="route-controller-manager" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.158357 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="a41b80c5-58ef-4d96-a176-02d0618297ee" containerName="route-controller-manager" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.158460 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="a41b80c5-58ef-4d96-a176-02d0618297ee" containerName="route-controller-manager" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.158818 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.160724 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.160752 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.161457 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.162096 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.162889 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.165739 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.176385 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx"] Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.214320 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21265df2-25f1-466c-b267-95de545523c8-client-ca\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.214383 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21265df2-25f1-466c-b267-95de545523c8-serving-cert\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.214450 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21265df2-25f1-466c-b267-95de545523c8-config\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.214479 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2s75\" (UniqueName: \"kubernetes.io/projected/21265df2-25f1-466c-b267-95de545523c8-kube-api-access-v2s75\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.315125 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21265df2-25f1-466c-b267-95de545523c8-config\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.315174 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2s75\" (UniqueName: \"kubernetes.io/projected/21265df2-25f1-466c-b267-95de545523c8-kube-api-access-v2s75\") pod 
\"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.315237 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21265df2-25f1-466c-b267-95de545523c8-client-ca\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.315257 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21265df2-25f1-466c-b267-95de545523c8-serving-cert\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.316330 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21265df2-25f1-466c-b267-95de545523c8-client-ca\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.316453 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21265df2-25f1-466c-b267-95de545523c8-config\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.319157 4836 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21265df2-25f1-466c-b267-95de545523c8-serving-cert\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.331717 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2s75\" (UniqueName: \"kubernetes.io/projected/21265df2-25f1-466c-b267-95de545523c8-kube-api-access-v2s75\") pod \"route-controller-manager-6bb9976d8b-b8bzx\" (UID: \"21265df2-25f1-466c-b267-95de545523c8\") " pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.474366 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:39 crc kubenswrapper[4836]: I0217 14:12:39.908512 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx"] Feb 17 14:12:39 crc kubenswrapper[4836]: W0217 14:12:39.913438 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21265df2_25f1_466c_b267_95de545523c8.slice/crio-3c88aa34a5ef7da27231a11ea2ecd16c4bc19e305853972a881e458301462b71 WatchSource:0}: Error finding container 3c88aa34a5ef7da27231a11ea2ecd16c4bc19e305853972a881e458301462b71: Status 404 returned error can't find the container with id 3c88aa34a5ef7da27231a11ea2ecd16c4bc19e305853972a881e458301462b71 Feb 17 14:12:40 crc kubenswrapper[4836]: I0217 14:12:40.538978 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" 
event={"ID":"21265df2-25f1-466c-b267-95de545523c8","Type":"ContainerStarted","Data":"c7748556e85cb8444b90b10d4dc94cc5ec7aa1761b739525c90fa073be2d9287"} Feb 17 14:12:40 crc kubenswrapper[4836]: I0217 14:12:40.539313 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" event={"ID":"21265df2-25f1-466c-b267-95de545523c8","Type":"ContainerStarted","Data":"3c88aa34a5ef7da27231a11ea2ecd16c4bc19e305853972a881e458301462b71"} Feb 17 14:12:40 crc kubenswrapper[4836]: I0217 14:12:40.539339 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:40 crc kubenswrapper[4836]: I0217 14:12:40.544694 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" Feb 17 14:12:40 crc kubenswrapper[4836]: I0217 14:12:40.556630 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bb9976d8b-b8bzx" podStartSLOduration=3.5566019129999997 podStartE2EDuration="3.556601913s" podCreationTimestamp="2026-02-17 14:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:12:40.552736625 +0000 UTC m=+386.895664894" watchObservedRunningTime="2026-02-17 14:12:40.556601913 +0000 UTC m=+386.899530182" Feb 17 14:12:40 crc kubenswrapper[4836]: I0217 14:12:40.574094 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41b80c5-58ef-4d96-a176-02d0618297ee" path="/var/lib/kubelet/pods/a41b80c5-58ef-4d96-a176-02d0618297ee/volumes" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.566259 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxj4j"] Feb 17 14:12:41 crc 
kubenswrapper[4836]: I0217 14:12:41.567535 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.569918 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.578422 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxj4j"] Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.642869 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaecd71b-3b00-427a-9654-9d04af5469b9-catalog-content\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.642993 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qllf\" (UniqueName: \"kubernetes.io/projected/eaecd71b-3b00-427a-9654-9d04af5469b9-kube-api-access-4qllf\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.643155 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaecd71b-3b00-427a-9654-9d04af5469b9-utilities\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.745110 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/eaecd71b-3b00-427a-9654-9d04af5469b9-utilities\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.745182 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaecd71b-3b00-427a-9654-9d04af5469b9-catalog-content\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.745205 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qllf\" (UniqueName: \"kubernetes.io/projected/eaecd71b-3b00-427a-9654-9d04af5469b9-kube-api-access-4qllf\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.746022 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaecd71b-3b00-427a-9654-9d04af5469b9-utilities\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.746030 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaecd71b-3b00-427a-9654-9d04af5469b9-catalog-content\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.768219 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qllf\" (UniqueName: 
\"kubernetes.io/projected/eaecd71b-3b00-427a-9654-9d04af5469b9-kube-api-access-4qllf\") pod \"certified-operators-xxj4j\" (UID: \"eaecd71b-3b00-427a-9654-9d04af5469b9\") " pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:41 crc kubenswrapper[4836]: I0217 14:12:41.927781 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxj4j" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.334397 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxj4j"] Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.551817 4836 generic.go:334] "Generic (PLEG): container finished" podID="eaecd71b-3b00-427a-9654-9d04af5469b9" containerID="aa7f9ce852f645a39beee4170859071aedade9693dce893892a107d8b6ef0aae" exitCode=0 Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.551918 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxj4j" event={"ID":"eaecd71b-3b00-427a-9654-9d04af5469b9","Type":"ContainerDied","Data":"aa7f9ce852f645a39beee4170859071aedade9693dce893892a107d8b6ef0aae"} Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.552234 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxj4j" event={"ID":"eaecd71b-3b00-427a-9654-9d04af5469b9","Type":"ContainerStarted","Data":"9e95b3dc7d55a998957a83e71e517055b8e4593ed57b59207d796a207e2973c1"} Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.578843 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zjxvt"] Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.580040 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.582506 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.585322 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zjxvt"] Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.655767 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212802dd-4c4f-444a-b443-bc3bbd1431bc-utilities\") pod \"community-operators-zjxvt\" (UID: \"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.655816 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxt6l\" (UniqueName: \"kubernetes.io/projected/212802dd-4c4f-444a-b443-bc3bbd1431bc-kube-api-access-wxt6l\") pod \"community-operators-zjxvt\" (UID: \"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.655844 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212802dd-4c4f-444a-b443-bc3bbd1431bc-catalog-content\") pod \"community-operators-zjxvt\" (UID: \"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.757384 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212802dd-4c4f-444a-b443-bc3bbd1431bc-catalog-content\") pod \"community-operators-zjxvt\" (UID: 
\"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.757499 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212802dd-4c4f-444a-b443-bc3bbd1431bc-utilities\") pod \"community-operators-zjxvt\" (UID: \"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.757522 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxt6l\" (UniqueName: \"kubernetes.io/projected/212802dd-4c4f-444a-b443-bc3bbd1431bc-kube-api-access-wxt6l\") pod \"community-operators-zjxvt\" (UID: \"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.758269 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/212802dd-4c4f-444a-b443-bc3bbd1431bc-catalog-content\") pod \"community-operators-zjxvt\" (UID: \"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.758597 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/212802dd-4c4f-444a-b443-bc3bbd1431bc-utilities\") pod \"community-operators-zjxvt\" (UID: \"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt" Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.778480 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxt6l\" (UniqueName: \"kubernetes.io/projected/212802dd-4c4f-444a-b443-bc3bbd1431bc-kube-api-access-wxt6l\") pod \"community-operators-zjxvt\" (UID: 
\"212802dd-4c4f-444a-b443-bc3bbd1431bc\") " pod="openshift-marketplace/community-operators-zjxvt"
Feb 17 14:12:42 crc kubenswrapper[4836]: I0217 14:12:42.929851 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zjxvt"
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.357131 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zjxvt"]
Feb 17 14:12:43 crc kubenswrapper[4836]: W0217 14:12:43.363633 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod212802dd_4c4f_444a_b443_bc3bbd1431bc.slice/crio-02fb12ea464e187c38e849c376aa4db9e9c06deb2e23fabe64865bbea9e1ddbd WatchSource:0}: Error finding container 02fb12ea464e187c38e849c376aa4db9e9c06deb2e23fabe64865bbea9e1ddbd: Status 404 returned error can't find the container with id 02fb12ea464e187c38e849c376aa4db9e9c06deb2e23fabe64865bbea9e1ddbd
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.562548 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxj4j" event={"ID":"eaecd71b-3b00-427a-9654-9d04af5469b9","Type":"ContainerStarted","Data":"21a01e036ae8bc2d46ea0e432083ed8e2bc2fae27eab393688cc5b04a75de90c"}
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.563966 4836 generic.go:334] "Generic (PLEG): container finished" podID="212802dd-4c4f-444a-b443-bc3bbd1431bc" containerID="d9ae08d204abbfd6e235974dfd87d8cc8bc8ecedd8f26f0aed33b4f099c1a758" exitCode=0
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.564060 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjxvt" event={"ID":"212802dd-4c4f-444a-b443-bc3bbd1431bc","Type":"ContainerDied","Data":"d9ae08d204abbfd6e235974dfd87d8cc8bc8ecedd8f26f0aed33b4f099c1a758"}
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.564092 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjxvt" event={"ID":"212802dd-4c4f-444a-b443-bc3bbd1431bc","Type":"ContainerStarted","Data":"02fb12ea464e187c38e849c376aa4db9e9c06deb2e23fabe64865bbea9e1ddbd"}
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.964161 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8gtc9"]
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.965689 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.967812 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 14:12:43 crc kubenswrapper[4836]: I0217 14:12:43.977577 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gtc9"]
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.073924 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb3c078-0953-4561-a532-cc25ff32d845-utilities\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.073977 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27sqg\" (UniqueName: \"kubernetes.io/projected/8fb3c078-0953-4561-a532-cc25ff32d845-kube-api-access-27sqg\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.074105 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb3c078-0953-4561-a532-cc25ff32d845-catalog-content\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.175259 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb3c078-0953-4561-a532-cc25ff32d845-catalog-content\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.175370 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb3c078-0953-4561-a532-cc25ff32d845-utilities\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.175492 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27sqg\" (UniqueName: \"kubernetes.io/projected/8fb3c078-0953-4561-a532-cc25ff32d845-kube-api-access-27sqg\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.175888 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fb3c078-0953-4561-a532-cc25ff32d845-catalog-content\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.175959 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fb3c078-0953-4561-a532-cc25ff32d845-utilities\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.196230 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27sqg\" (UniqueName: \"kubernetes.io/projected/8fb3c078-0953-4561-a532-cc25ff32d845-kube-api-access-27sqg\") pod \"redhat-marketplace-8gtc9\" (UID: \"8fb3c078-0953-4561-a532-cc25ff32d845\") " pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.284018 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.585103 4836 generic.go:334] "Generic (PLEG): container finished" podID="eaecd71b-3b00-427a-9654-9d04af5469b9" containerID="21a01e036ae8bc2d46ea0e432083ed8e2bc2fae27eab393688cc5b04a75de90c" exitCode=0
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.590239 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxj4j" event={"ID":"eaecd71b-3b00-427a-9654-9d04af5469b9","Type":"ContainerDied","Data":"21a01e036ae8bc2d46ea0e432083ed8e2bc2fae27eab393688cc5b04a75de90c"}
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.590314 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjxvt" event={"ID":"212802dd-4c4f-444a-b443-bc3bbd1431bc","Type":"ContainerStarted","Data":"57121581c3ae7e47488b89b50164c62707ba95ae0cdbe02ba6f16d945bcfc23e"}
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.713652 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8gtc9"]
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.961211 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-89b2r"]
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.962651 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.965224 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 14:12:44 crc kubenswrapper[4836]: I0217 14:12:44.977978 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89b2r"]
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.085715 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-utilities\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.085781 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97574\" (UniqueName: \"kubernetes.io/projected/cc99d806-e359-4577-8a61-1b527af8779f-kube-api-access-97574\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.085823 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-catalog-content\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.187000 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97574\" (UniqueName: \"kubernetes.io/projected/cc99d806-e359-4577-8a61-1b527af8779f-kube-api-access-97574\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.187091 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-catalog-content\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.187133 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-utilities\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.187624 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-utilities\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.187771 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-catalog-content\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.206273 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97574\" (UniqueName: \"kubernetes.io/projected/cc99d806-e359-4577-8a61-1b527af8779f-kube-api-access-97574\") pod \"redhat-operators-89b2r\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") " pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.290576 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.595370 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxj4j" event={"ID":"eaecd71b-3b00-427a-9654-9d04af5469b9","Type":"ContainerStarted","Data":"c37fc7ad8678f0bd7ef3cfd0b3f911e8100a176cad637e5e7823983505fa0d4f"}
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.597083 4836 generic.go:334] "Generic (PLEG): container finished" podID="8fb3c078-0953-4561-a532-cc25ff32d845" containerID="eafccfb016bdc7ac4ddee448c0db9934c540fe9db8c38a30073338c2456bb37b" exitCode=0
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.597173 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gtc9" event={"ID":"8fb3c078-0953-4561-a532-cc25ff32d845","Type":"ContainerDied","Data":"eafccfb016bdc7ac4ddee448c0db9934c540fe9db8c38a30073338c2456bb37b"}
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.597216 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gtc9" event={"ID":"8fb3c078-0953-4561-a532-cc25ff32d845","Type":"ContainerStarted","Data":"cf63cc751376ca9c7e7c896ffb7d4daf46e0a826afa0f6eb466c4e72cfccd5e7"}
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.598925 4836 generic.go:334] "Generic (PLEG): container finished" podID="212802dd-4c4f-444a-b443-bc3bbd1431bc" containerID="57121581c3ae7e47488b89b50164c62707ba95ae0cdbe02ba6f16d945bcfc23e" exitCode=0
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.598961 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjxvt" event={"ID":"212802dd-4c4f-444a-b443-bc3bbd1431bc","Type":"ContainerDied","Data":"57121581c3ae7e47488b89b50164c62707ba95ae0cdbe02ba6f16d945bcfc23e"}
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.661890 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxj4j" podStartSLOduration=2.214965978 podStartE2EDuration="4.661862618s" podCreationTimestamp="2026-02-17 14:12:41 +0000 UTC" firstStartedPulling="2026-02-17 14:12:42.555270447 +0000 UTC m=+388.898198736" lastFinishedPulling="2026-02-17 14:12:45.002167107 +0000 UTC m=+391.345095376" observedRunningTime="2026-02-17 14:12:45.626387444 +0000 UTC m=+391.969315733" watchObservedRunningTime="2026-02-17 14:12:45.661862618 +0000 UTC m=+392.004790887"
Feb 17 14:12:45 crc kubenswrapper[4836]: I0217 14:12:45.713474 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89b2r"]
Feb 17 14:12:45 crc kubenswrapper[4836]: W0217 14:12:45.718777 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc99d806_e359_4577_8a61_1b527af8779f.slice/crio-7917a2258074c4b89c2b9c207136528b694ef3fdf3891f163bd96f2105f7c9c7 WatchSource:0}: Error finding container 7917a2258074c4b89c2b9c207136528b694ef3fdf3891f163bd96f2105f7c9c7: Status 404 returned error can't find the container with id 7917a2258074c4b89c2b9c207136528b694ef3fdf3891f163bd96f2105f7c9c7
Feb 17 14:12:46 crc kubenswrapper[4836]: I0217 14:12:46.608318 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zjxvt" event={"ID":"212802dd-4c4f-444a-b443-bc3bbd1431bc","Type":"ContainerStarted","Data":"100f1474b576f564309897e2bb61f7a8e5947070d42242a0989d63c32f072200"}
Feb 17 14:12:46 crc kubenswrapper[4836]: I0217 14:12:46.610711 4836 generic.go:334] "Generic (PLEG): container finished" podID="cc99d806-e359-4577-8a61-1b527af8779f" containerID="fa952a578ab7d74e43550d2abf42e1871632978ec68916c0a6508b2ed82226f0" exitCode=0
Feb 17 14:12:46 crc kubenswrapper[4836]: I0217 14:12:46.610805 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89b2r" event={"ID":"cc99d806-e359-4577-8a61-1b527af8779f","Type":"ContainerDied","Data":"fa952a578ab7d74e43550d2abf42e1871632978ec68916c0a6508b2ed82226f0"}
Feb 17 14:12:46 crc kubenswrapper[4836]: I0217 14:12:46.610844 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89b2r" event={"ID":"cc99d806-e359-4577-8a61-1b527af8779f","Type":"ContainerStarted","Data":"7917a2258074c4b89c2b9c207136528b694ef3fdf3891f163bd96f2105f7c9c7"}
Feb 17 14:12:46 crc kubenswrapper[4836]: I0217 14:12:46.612609 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gtc9" event={"ID":"8fb3c078-0953-4561-a532-cc25ff32d845","Type":"ContainerStarted","Data":"93383f2b574dbb3dd74a694937e4f11376857c8902e53ddc5cef1e5d40402788"}
Feb 17 14:12:46 crc kubenswrapper[4836]: I0217 14:12:46.633704 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zjxvt" podStartSLOduration=2.123615716 podStartE2EDuration="4.633667797s" podCreationTimestamp="2026-02-17 14:12:42 +0000 UTC" firstStartedPulling="2026-02-17 14:12:43.565667082 +0000 UTC m=+389.908595351" lastFinishedPulling="2026-02-17 14:12:46.075719163 +0000 UTC m=+392.418647432" observedRunningTime="2026-02-17 14:12:46.627228493 +0000 UTC m=+392.970156772" watchObservedRunningTime="2026-02-17 14:12:46.633667797 +0000 UTC m=+392.976596076"
Feb 17 14:12:47 crc kubenswrapper[4836]: I0217 14:12:47.619923 4836 generic.go:334] "Generic (PLEG): container finished" podID="8fb3c078-0953-4561-a532-cc25ff32d845" containerID="93383f2b574dbb3dd74a694937e4f11376857c8902e53ddc5cef1e5d40402788" exitCode=0
Feb 17 14:12:47 crc kubenswrapper[4836]: I0217 14:12:47.620035 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gtc9" event={"ID":"8fb3c078-0953-4561-a532-cc25ff32d845","Type":"ContainerDied","Data":"93383f2b574dbb3dd74a694937e4f11376857c8902e53ddc5cef1e5d40402788"}
Feb 17 14:12:48 crc kubenswrapper[4836]: I0217 14:12:48.627078 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89b2r" event={"ID":"cc99d806-e359-4577-8a61-1b527af8779f","Type":"ContainerStarted","Data":"99c757b68ed859a793668b56d22b853641589be9aa542f670159f298a8c5ffcd"}
Feb 17 14:12:48 crc kubenswrapper[4836]: I0217 14:12:48.629623 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8gtc9" event={"ID":"8fb3c078-0953-4561-a532-cc25ff32d845","Type":"ContainerStarted","Data":"389c3d1c8aeeb9b0460e0bedb15b9496356ba81bdfc15eae0bdea17e125e5949"}
Feb 17 14:12:48 crc kubenswrapper[4836]: I0217 14:12:48.690846 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8gtc9" podStartSLOduration=3.023952167 podStartE2EDuration="5.690824268s" podCreationTimestamp="2026-02-17 14:12:43 +0000 UTC" firstStartedPulling="2026-02-17 14:12:45.59914713 +0000 UTC m=+391.942075409" lastFinishedPulling="2026-02-17 14:12:48.266019241 +0000 UTC m=+394.608947510" observedRunningTime="2026-02-17 14:12:48.687933136 +0000 UTC m=+395.030861425" watchObservedRunningTime="2026-02-17 14:12:48.690824268 +0000 UTC m=+395.033752537"
Feb 17 14:12:50 crc kubenswrapper[4836]: I0217 14:12:50.643069 4836 generic.go:334] "Generic (PLEG): container finished" podID="cc99d806-e359-4577-8a61-1b527af8779f" containerID="99c757b68ed859a793668b56d22b853641589be9aa542f670159f298a8c5ffcd" exitCode=0
Feb 17 14:12:50 crc kubenswrapper[4836]: I0217 14:12:50.643169 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89b2r" event={"ID":"cc99d806-e359-4577-8a61-1b527af8779f","Type":"ContainerDied","Data":"99c757b68ed859a793668b56d22b853641589be9aa542f670159f298a8c5ffcd"}
Feb 17 14:12:51 crc kubenswrapper[4836]: I0217 14:12:51.655113 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89b2r" event={"ID":"cc99d806-e359-4577-8a61-1b527af8779f","Type":"ContainerStarted","Data":"b84dd65de54881081222d1401d684becd3ab6f396a5d3ddb1a10e413f4f858e0"}
Feb 17 14:12:51 crc kubenswrapper[4836]: I0217 14:12:51.683314 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-89b2r" podStartSLOduration=2.879315975 podStartE2EDuration="7.683272681s" podCreationTimestamp="2026-02-17 14:12:44 +0000 UTC" firstStartedPulling="2026-02-17 14:12:46.612246986 +0000 UTC m=+392.955175255" lastFinishedPulling="2026-02-17 14:12:51.416203692 +0000 UTC m=+397.759131961" observedRunningTime="2026-02-17 14:12:51.682492488 +0000 UTC m=+398.025420787" watchObservedRunningTime="2026-02-17 14:12:51.683272681 +0000 UTC m=+398.026200970"
Feb 17 14:12:51 crc kubenswrapper[4836]: I0217 14:12:51.928042 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxj4j"
Feb 17 14:12:51 crc kubenswrapper[4836]: I0217 14:12:51.928458 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxj4j"
Feb 17 14:12:51 crc kubenswrapper[4836]: I0217 14:12:51.993070 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxj4j"
Feb 17 14:12:52 crc kubenswrapper[4836]: I0217 14:12:52.702053 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxj4j"
Feb 17 14:12:52 crc kubenswrapper[4836]: I0217 14:12:52.930335 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zjxvt"
Feb 17 14:12:52 crc kubenswrapper[4836]: I0217 14:12:52.930408 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zjxvt"
Feb 17 14:12:52 crc kubenswrapper[4836]: I0217 14:12:52.988305 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zjxvt"
Feb 17 14:12:53 crc kubenswrapper[4836]: I0217 14:12:53.716776 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zjxvt"
Feb 17 14:12:54 crc kubenswrapper[4836]: I0217 14:12:54.284870 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:54 crc kubenswrapper[4836]: I0217 14:12:54.284935 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:54 crc kubenswrapper[4836]: I0217 14:12:54.367563 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:54 crc kubenswrapper[4836]: I0217 14:12:54.720558 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8gtc9"
Feb 17 14:12:55 crc kubenswrapper[4836]: I0217 14:12:55.291520 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:55 crc kubenswrapper[4836]: I0217 14:12:55.293201 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:12:56 crc kubenswrapper[4836]: I0217 14:12:56.328794 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-89b2r" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="registry-server" probeResult="failure" output=<
Feb 17 14:12:56 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s
Feb 17 14:12:56 crc kubenswrapper[4836]: >
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.176453 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5qw6z"]
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.177240 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.188263 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5qw6z"]
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279284 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279614 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-registry-tls\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279637 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbed41f2-8f89-4e10-a73e-9c44df59b13b-trusted-ca\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279667 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbed41f2-8f89-4e10-a73e-9c44df59b13b-registry-certificates\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279694 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbed41f2-8f89-4e10-a73e-9c44df59b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279710 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279881 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzw72\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-kube-api-access-zzw72\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.279911 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbed41f2-8f89-4e10-a73e-9c44df59b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.306258 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.380503 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbed41f2-8f89-4e10-a73e-9c44df59b13b-trusted-ca\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.381510 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-registry-tls\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.381643 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbed41f2-8f89-4e10-a73e-9c44df59b13b-registry-certificates\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.381672 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbed41f2-8f89-4e10-a73e-9c44df59b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.381689 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.382452 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbed41f2-8f89-4e10-a73e-9c44df59b13b-trusted-ca\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.382605 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzw72\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-kube-api-access-zzw72\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.382718 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbed41f2-8f89-4e10-a73e-9c44df59b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.383143 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dbed41f2-8f89-4e10-a73e-9c44df59b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.383234 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dbed41f2-8f89-4e10-a73e-9c44df59b13b-registry-certificates\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.387324 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-registry-tls\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.401709 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dbed41f2-8f89-4e10-a73e-9c44df59b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.405003 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.409214 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzw72\" (UniqueName: \"kubernetes.io/projected/dbed41f2-8f89-4e10-a73e-9c44df59b13b-kube-api-access-zzw72\") pod \"image-registry-66df7c8f76-5qw6z\" (UID: \"dbed41f2-8f89-4e10-a73e-9c44df59b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.501849 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z"
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.825979 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"]
Feb 17 14:12:57 crc kubenswrapper[4836]: I0217 14:12:57.826748 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr" podUID="3135ca20-3162-4278-bbd7-de1d6f977dfe" containerName="controller-manager" containerID="cri-o://7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a" gracePeriod=30
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.228772 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5qw6z"]
Feb 17 14:12:58 crc kubenswrapper[4836]: W0217 14:12:58.243160 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbed41f2_8f89_4e10_a73e_9c44df59b13b.slice/crio-29146d415312ad83deb04838fc6884623244cd4c1654ce771355c8d0555a4b15 WatchSource:0}: Error finding container 29146d415312ad83deb04838fc6884623244cd4c1654ce771355c8d0555a4b15: Status 404 returned error can't find the container with id 29146d415312ad83deb04838fc6884623244cd4c1654ce771355c8d0555a4b15
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.255380 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.300078 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdjgk\" (UniqueName: \"kubernetes.io/projected/3135ca20-3162-4278-bbd7-de1d6f977dfe-kube-api-access-hdjgk\") pod \"3135ca20-3162-4278-bbd7-de1d6f977dfe\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") "
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.300331 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-client-ca\") pod \"3135ca20-3162-4278-bbd7-de1d6f977dfe\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") "
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.300470 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-proxy-ca-bundles\") pod \"3135ca20-3162-4278-bbd7-de1d6f977dfe\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") "
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.300558 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-config\") pod \"3135ca20-3162-4278-bbd7-de1d6f977dfe\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") "
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.300640 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3135ca20-3162-4278-bbd7-de1d6f977dfe-serving-cert\") pod \"3135ca20-3162-4278-bbd7-de1d6f977dfe\" (UID: \"3135ca20-3162-4278-bbd7-de1d6f977dfe\") "
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.301317 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-client-ca" (OuterVolumeSpecName: "client-ca") pod "3135ca20-3162-4278-bbd7-de1d6f977dfe" (UID: "3135ca20-3162-4278-bbd7-de1d6f977dfe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.301399 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3135ca20-3162-4278-bbd7-de1d6f977dfe" (UID: "3135ca20-3162-4278-bbd7-de1d6f977dfe"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.301777 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-config" (OuterVolumeSpecName: "config") pod "3135ca20-3162-4278-bbd7-de1d6f977dfe" (UID: "3135ca20-3162-4278-bbd7-de1d6f977dfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.312888 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3135ca20-3162-4278-bbd7-de1d6f977dfe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3135ca20-3162-4278-bbd7-de1d6f977dfe" (UID: "3135ca20-3162-4278-bbd7-de1d6f977dfe"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.313597 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3135ca20-3162-4278-bbd7-de1d6f977dfe-kube-api-access-hdjgk" (OuterVolumeSpecName: "kube-api-access-hdjgk") pod "3135ca20-3162-4278-bbd7-de1d6f977dfe" (UID: "3135ca20-3162-4278-bbd7-de1d6f977dfe"). InnerVolumeSpecName "kube-api-access-hdjgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.402798 4836 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3135ca20-3162-4278-bbd7-de1d6f977dfe-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.402851 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdjgk\" (UniqueName: \"kubernetes.io/projected/3135ca20-3162-4278-bbd7-de1d6f977dfe-kube-api-access-hdjgk\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.402862 4836 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.402872 4836 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.402884 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3135ca20-3162-4278-bbd7-de1d6f977dfe-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.712704 4836 generic.go:334] "Generic (PLEG): container finished" 
podID="3135ca20-3162-4278-bbd7-de1d6f977dfe" containerID="7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a" exitCode=0 Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.712848 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr" event={"ID":"3135ca20-3162-4278-bbd7-de1d6f977dfe","Type":"ContainerDied","Data":"7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a"} Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.713117 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr" event={"ID":"3135ca20-3162-4278-bbd7-de1d6f977dfe","Type":"ContainerDied","Data":"7e2281e26a53d8122732046bbdbf7cffb006b03fc9fd9923b35c064e85e5470c"} Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.713007 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85f58dddc7-nc7rr" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.713143 4836 scope.go:117] "RemoveContainer" containerID="7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.715752 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z" event={"ID":"dbed41f2-8f89-4e10-a73e-9c44df59b13b","Type":"ContainerStarted","Data":"236e86df2c880f2555debb95a06cfc97b9c450df455c00d9ff6b005211b9594a"} Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.715805 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z" event={"ID":"dbed41f2-8f89-4e10-a73e-9c44df59b13b","Type":"ContainerStarted","Data":"29146d415312ad83deb04838fc6884623244cd4c1654ce771355c8d0555a4b15"} Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.715998 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.734943 4836 scope.go:117] "RemoveContainer" containerID="7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a" Feb 17 14:12:58 crc kubenswrapper[4836]: E0217 14:12:58.735752 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a\": container with ID starting with 7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a not found: ID does not exist" containerID="7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.735814 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a"} err="failed to get container status \"7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a\": rpc error: code = NotFound desc = could not find container \"7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a\": container with ID starting with 7563192babedf15dbfd2b3db7bca7b850f62a2b88195271072c4dd96129be51a not found: ID does not exist" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.740269 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z" podStartSLOduration=1.740221024 podStartE2EDuration="1.740221024s" podCreationTimestamp="2026-02-17 14:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:12:58.737672241 +0000 UTC m=+405.080600530" watchObservedRunningTime="2026-02-17 14:12:58.740221024 +0000 UTC m=+405.083149303" Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.752868 4836 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"] Feb 17 14:12:58 crc kubenswrapper[4836]: I0217 14:12:58.766627 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-85f58dddc7-nc7rr"] Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.175754 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69d6c474bd-jspw5"] Feb 17 14:12:59 crc kubenswrapper[4836]: E0217 14:12:59.176388 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3135ca20-3162-4278-bbd7-de1d6f977dfe" containerName="controller-manager" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.176478 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3135ca20-3162-4278-bbd7-de1d6f977dfe" containerName="controller-manager" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.176870 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="3135ca20-3162-4278-bbd7-de1d6f977dfe" containerName="controller-manager" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.177708 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.180900 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.181217 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.184254 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.189266 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.189777 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.190071 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.190254 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.191522 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69d6c474bd-jspw5"] Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.313412 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-config\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " 
pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.313484 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d8tm\" (UniqueName: \"kubernetes.io/projected/f772d120-59e6-4232-ada8-751b59262fc5-kube-api-access-2d8tm\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.313515 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f772d120-59e6-4232-ada8-751b59262fc5-serving-cert\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.313540 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-client-ca\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.313567 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-proxy-ca-bundles\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.414408 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-config\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.414503 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d8tm\" (UniqueName: \"kubernetes.io/projected/f772d120-59e6-4232-ada8-751b59262fc5-kube-api-access-2d8tm\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.414538 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f772d120-59e6-4232-ada8-751b59262fc5-serving-cert\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.414567 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-client-ca\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.414593 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-proxy-ca-bundles\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 
14:12:59.416047 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-client-ca\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.416959 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-config\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.417024 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f772d120-59e6-4232-ada8-751b59262fc5-proxy-ca-bundles\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.426509 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f772d120-59e6-4232-ada8-751b59262fc5-serving-cert\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.436767 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d8tm\" (UniqueName: \"kubernetes.io/projected/f772d120-59e6-4232-ada8-751b59262fc5-kube-api-access-2d8tm\") pod \"controller-manager-69d6c474bd-jspw5\" (UID: \"f772d120-59e6-4232-ada8-751b59262fc5\") " 
pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.497034 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.765207 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.765678 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:12:59 crc kubenswrapper[4836]: I0217 14:12:59.962334 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69d6c474bd-jspw5"] Feb 17 14:13:00 crc kubenswrapper[4836]: I0217 14:13:00.577145 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3135ca20-3162-4278-bbd7-de1d6f977dfe" path="/var/lib/kubelet/pods/3135ca20-3162-4278-bbd7-de1d6f977dfe/volumes" Feb 17 14:13:00 crc kubenswrapper[4836]: I0217 14:13:00.736891 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" event={"ID":"f772d120-59e6-4232-ada8-751b59262fc5","Type":"ContainerStarted","Data":"ff24011989436c878dcb7ddeca2877a0ab55a69759535b681dce4057da5174cc"} Feb 17 14:13:00 crc kubenswrapper[4836]: I0217 14:13:00.736963 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" 
event={"ID":"f772d120-59e6-4232-ada8-751b59262fc5","Type":"ContainerStarted","Data":"8227ce45a3cb16dc92ac913a96d2d14ccdb873a020d0c2dfa26dd3a64bf600d1"} Feb 17 14:13:00 crc kubenswrapper[4836]: I0217 14:13:00.737144 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:13:00 crc kubenswrapper[4836]: I0217 14:13:00.742862 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" Feb 17 14:13:00 crc kubenswrapper[4836]: I0217 14:13:00.760644 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69d6c474bd-jspw5" podStartSLOduration=3.760621297 podStartE2EDuration="3.760621297s" podCreationTimestamp="2026-02-17 14:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:13:00.757345873 +0000 UTC m=+407.100274162" watchObservedRunningTime="2026-02-17 14:13:00.760621297 +0000 UTC m=+407.103549576" Feb 17 14:13:05 crc kubenswrapper[4836]: I0217 14:13:05.335922 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-89b2r" Feb 17 14:13:05 crc kubenswrapper[4836]: I0217 14:13:05.390944 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-89b2r" Feb 17 14:13:17 crc kubenswrapper[4836]: I0217 14:13:17.508523 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5qw6z" Feb 17 14:13:17 crc kubenswrapper[4836]: I0217 14:13:17.584401 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5vhz9"] Feb 17 14:13:29 crc kubenswrapper[4836]: I0217 14:13:29.765577 4836 
patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:13:29 crc kubenswrapper[4836]: I0217 14:13:29.766378 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:13:29 crc kubenswrapper[4836]: I0217 14:13:29.766446 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:13:29 crc kubenswrapper[4836]: I0217 14:13:29.767375 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ca471c2a83c51e21c02e6df84d64c6720d133c689bc0501ece1848cccb37b3b"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:13:29 crc kubenswrapper[4836]: I0217 14:13:29.767446 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://6ca471c2a83c51e21c02e6df84d64c6720d133c689bc0501ece1848cccb37b3b" gracePeriod=600 Feb 17 14:13:30 crc kubenswrapper[4836]: I0217 14:13:30.059278 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="6ca471c2a83c51e21c02e6df84d64c6720d133c689bc0501ece1848cccb37b3b" exitCode=0 Feb 17 14:13:30 crc 
kubenswrapper[4836]: I0217 14:13:30.059346 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"6ca471c2a83c51e21c02e6df84d64c6720d133c689bc0501ece1848cccb37b3b"} Feb 17 14:13:30 crc kubenswrapper[4836]: I0217 14:13:30.059854 4836 scope.go:117] "RemoveContainer" containerID="c10f7b69881ea75bfa81905a42379fed8daf48e788b5f2787c522a52d28f58cb" Feb 17 14:13:31 crc kubenswrapper[4836]: I0217 14:13:31.067820 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"1b2a0d64ec4a5faa95e6312a8de2b21c8f3e85f4d851c39760904a4b16753249"} Feb 17 14:13:42 crc kubenswrapper[4836]: I0217 14:13:42.636616 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" podUID="4cd3f585-c95f-43ee-962c-ea33aff90415" containerName="registry" containerID="cri-o://bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6" gracePeriod=30 Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.091343 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.138326 4836 generic.go:334] "Generic (PLEG): container finished" podID="4cd3f585-c95f-43ee-962c-ea33aff90415" containerID="bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6" exitCode=0 Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.138376 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" event={"ID":"4cd3f585-c95f-43ee-962c-ea33aff90415","Type":"ContainerDied","Data":"bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6"} Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.138393 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.138406 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5vhz9" event={"ID":"4cd3f585-c95f-43ee-962c-ea33aff90415","Type":"ContainerDied","Data":"b92bf709add22f9c57e92a26debc7c9604b5ddd76791fbcef0b8821c381eba8e"} Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.138429 4836 scope.go:117] "RemoveContainer" containerID="bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6" Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.161656 4836 scope.go:117] "RemoveContainer" containerID="bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6" Feb 17 14:13:43 crc kubenswrapper[4836]: E0217 14:13:43.162546 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6\": container with ID starting with bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6 not found: ID does not exist" 
containerID="bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6" Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.162602 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6"} err="failed to get container status \"bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6\": rpc error: code = NotFound desc = could not find container \"bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6\": container with ID starting with bdf9c0182351aaefe8f64ad91ea1c51a4b7acfffd267691001139a7e914dc3b6 not found: ID does not exist" Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.243834 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cd3f585-c95f-43ee-962c-ea33aff90415-ca-trust-extracted\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.243952 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhp9d\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-kube-api-access-vhp9d\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.243980 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-tls\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") " Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.244189 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") "
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.244235 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-trusted-ca\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") "
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.244269 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-certificates\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") "
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.244289 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cd3f585-c95f-43ee-962c-ea33aff90415-installation-pull-secrets\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") "
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.244425 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-bound-sa-token\") pod \"4cd3f585-c95f-43ee-962c-ea33aff90415\" (UID: \"4cd3f585-c95f-43ee-962c-ea33aff90415\") "
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.245589 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.246188 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.247027 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.247376 4836 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.252806 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-kube-api-access-vhp9d" (OuterVolumeSpecName: "kube-api-access-vhp9d") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "kube-api-access-vhp9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.252916 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd3f585-c95f-43ee-962c-ea33aff90415-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.254174 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.254669 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.258383 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.264822 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd3f585-c95f-43ee-962c-ea33aff90415-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4cd3f585-c95f-43ee-962c-ea33aff90415" (UID: "4cd3f585-c95f-43ee-962c-ea33aff90415"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.348611 4836 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.348671 4836 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cd3f585-c95f-43ee-962c-ea33aff90415-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.348686 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhp9d\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-kube-api-access-vhp9d\") on node \"crc\" DevicePath \"\""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.348703 4836 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cd3f585-c95f-43ee-962c-ea33aff90415-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.348716 4836 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cd3f585-c95f-43ee-962c-ea33aff90415-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.474722 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5vhz9"]
Feb 17 14:13:43 crc kubenswrapper[4836]: I0217 14:13:43.485712 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5vhz9"]
Feb 17 14:13:44 crc kubenswrapper[4836]: I0217 14:13:44.577764 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd3f585-c95f-43ee-962c-ea33aff90415" path="/var/lib/kubelet/pods/4cd3f585-c95f-43ee-962c-ea33aff90415/volumes"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.185491 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"]
Feb 17 14:15:00 crc kubenswrapper[4836]: E0217 14:15:00.186398 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd3f585-c95f-43ee-962c-ea33aff90415" containerName="registry"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.186412 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd3f585-c95f-43ee-962c-ea33aff90415" containerName="registry"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.186530 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd3f585-c95f-43ee-962c-ea33aff90415" containerName="registry"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.186957 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.189601 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.189755 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.206763 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"]
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.293194 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18a8fa91-916f-4b01-bf45-63e0add01572-secret-volume\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.293425 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r44dg\" (UniqueName: \"kubernetes.io/projected/18a8fa91-916f-4b01-bf45-63e0add01572-kube-api-access-r44dg\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.293463 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18a8fa91-916f-4b01-bf45-63e0add01572-config-volume\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.394849 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r44dg\" (UniqueName: \"kubernetes.io/projected/18a8fa91-916f-4b01-bf45-63e0add01572-kube-api-access-r44dg\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.394899 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18a8fa91-916f-4b01-bf45-63e0add01572-config-volume\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.394954 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18a8fa91-916f-4b01-bf45-63e0add01572-secret-volume\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.395828 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18a8fa91-916f-4b01-bf45-63e0add01572-config-volume\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.400883 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18a8fa91-916f-4b01-bf45-63e0add01572-secret-volume\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.411055 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r44dg\" (UniqueName: \"kubernetes.io/projected/18a8fa91-916f-4b01-bf45-63e0add01572-kube-api-access-r44dg\") pod \"collect-profiles-29522295-87mmf\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.507903 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:00 crc kubenswrapper[4836]: I0217 14:15:00.931086 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"]
Feb 17 14:15:01 crc kubenswrapper[4836]: I0217 14:15:01.730601 4836 generic.go:334] "Generic (PLEG): container finished" podID="18a8fa91-916f-4b01-bf45-63e0add01572" containerID="43a52fe036affd4e3617ffdb7972a0968f159b10d3199b68e41003595bd9384d" exitCode=0
Feb 17 14:15:01 crc kubenswrapper[4836]: I0217 14:15:01.730788 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf" event={"ID":"18a8fa91-916f-4b01-bf45-63e0add01572","Type":"ContainerDied","Data":"43a52fe036affd4e3617ffdb7972a0968f159b10d3199b68e41003595bd9384d"}
Feb 17 14:15:01 crc kubenswrapper[4836]: I0217 14:15:01.731405 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf" event={"ID":"18a8fa91-916f-4b01-bf45-63e0add01572","Type":"ContainerStarted","Data":"062c0bac4496dd7efecca69afab7a8c68666e9d2123b26903c1db4b26c6d0114"}
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.001091 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.165431 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18a8fa91-916f-4b01-bf45-63e0add01572-secret-volume\") pod \"18a8fa91-916f-4b01-bf45-63e0add01572\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") "
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.165597 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r44dg\" (UniqueName: \"kubernetes.io/projected/18a8fa91-916f-4b01-bf45-63e0add01572-kube-api-access-r44dg\") pod \"18a8fa91-916f-4b01-bf45-63e0add01572\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") "
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.165626 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18a8fa91-916f-4b01-bf45-63e0add01572-config-volume\") pod \"18a8fa91-916f-4b01-bf45-63e0add01572\" (UID: \"18a8fa91-916f-4b01-bf45-63e0add01572\") "
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.166542 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a8fa91-916f-4b01-bf45-63e0add01572-config-volume" (OuterVolumeSpecName: "config-volume") pod "18a8fa91-916f-4b01-bf45-63e0add01572" (UID: "18a8fa91-916f-4b01-bf45-63e0add01572"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.172246 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a8fa91-916f-4b01-bf45-63e0add01572-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "18a8fa91-916f-4b01-bf45-63e0add01572" (UID: "18a8fa91-916f-4b01-bf45-63e0add01572"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.172353 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a8fa91-916f-4b01-bf45-63e0add01572-kube-api-access-r44dg" (OuterVolumeSpecName: "kube-api-access-r44dg") pod "18a8fa91-916f-4b01-bf45-63e0add01572" (UID: "18a8fa91-916f-4b01-bf45-63e0add01572"). InnerVolumeSpecName "kube-api-access-r44dg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.267093 4836 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18a8fa91-916f-4b01-bf45-63e0add01572-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.267153 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r44dg\" (UniqueName: \"kubernetes.io/projected/18a8fa91-916f-4b01-bf45-63e0add01572-kube-api-access-r44dg\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.267168 4836 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18a8fa91-916f-4b01-bf45-63e0add01572-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.746378 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf" event={"ID":"18a8fa91-916f-4b01-bf45-63e0add01572","Type":"ContainerDied","Data":"062c0bac4496dd7efecca69afab7a8c68666e9d2123b26903c1db4b26c6d0114"}
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.746816 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="062c0bac4496dd7efecca69afab7a8c68666e9d2123b26903c1db4b26c6d0114"
Feb 17 14:15:03 crc kubenswrapper[4836]: I0217 14:15:03.746683 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522295-87mmf"
Feb 17 14:15:14 crc kubenswrapper[4836]: I0217 14:15:14.870506 4836 scope.go:117] "RemoveContainer" containerID="f5f1510b84a48fd765ca27386941284d20f6da0225cb6c655223588a86aa6f8f"
Feb 17 14:15:59 crc kubenswrapper[4836]: I0217 14:15:59.765216 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:15:59 crc kubenswrapper[4836]: I0217 14:15:59.765848 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:16:29 crc kubenswrapper[4836]: I0217 14:16:29.765070 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:16:29 crc kubenswrapper[4836]: I0217 14:16:29.765757 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:16:59 crc kubenswrapper[4836]: I0217 14:16:59.765154 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:16:59 crc kubenswrapper[4836]: I0217 14:16:59.765991 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:16:59 crc kubenswrapper[4836]: I0217 14:16:59.766062 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g"
Feb 17 14:16:59 crc kubenswrapper[4836]: I0217 14:16:59.766963 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b2a0d64ec4a5faa95e6312a8de2b21c8f3e85f4d851c39760904a4b16753249"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:16:59 crc kubenswrapper[4836]: I0217 14:16:59.767112 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://1b2a0d64ec4a5faa95e6312a8de2b21c8f3e85f4d851c39760904a4b16753249" gracePeriod=600
Feb 17 14:17:00 crc kubenswrapper[4836]: I0217 14:17:00.605979 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="1b2a0d64ec4a5faa95e6312a8de2b21c8f3e85f4d851c39760904a4b16753249" exitCode=0
Feb 17 14:17:00 crc kubenswrapper[4836]: I0217 14:17:00.606988 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"1b2a0d64ec4a5faa95e6312a8de2b21c8f3e85f4d851c39760904a4b16753249"}
Feb 17 14:17:00 crc kubenswrapper[4836]: I0217 14:17:00.607033 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"d7f43ee4be167fb696d056804834f76d74b6a96b2dd00fc7f1328e7b9c2e7869"}
Feb 17 14:17:00 crc kubenswrapper[4836]: I0217 14:17:00.607059 4836 scope.go:117] "RemoveContainer" containerID="6ca471c2a83c51e21c02e6df84d64c6720d133c689bc0501ece1848cccb37b3b"
Feb 17 14:17:14 crc kubenswrapper[4836]: I0217 14:17:14.922277 4836 scope.go:117] "RemoveContainer" containerID="56a4ac051fd52f2fd8e193686dffb745df251c7f892fec72d600a2fa80ecbd34"
Feb 17 14:17:14 crc kubenswrapper[4836]: I0217 14:17:14.952679 4836 scope.go:117] "RemoveContainer" containerID="12b9c51f4d9306ca0c2b4adb55d1695962298f8f615d1a514d7884045bb5aea1"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.166386 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"]
Feb 17 14:17:30 crc kubenswrapper[4836]: E0217 14:17:30.167154 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a8fa91-916f-4b01-bf45-63e0add01572" containerName="collect-profiles"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.167170 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a8fa91-916f-4b01-bf45-63e0add01572" containerName="collect-profiles"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.167274 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a8fa91-916f-4b01-bf45-63e0add01572" containerName="collect-profiles"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.168367 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.170449 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.186435 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"]
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.203326 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.203426 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.203460 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgg8\" (UniqueName: \"kubernetes.io/projected/f611c52f-90dc-454e-8c3c-ca9d6a915f58-kube-api-access-ddgg8\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.304325 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.304698 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.304808 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgg8\" (UniqueName: \"kubernetes.io/projected/f611c52f-90dc-454e-8c3c-ca9d6a915f58-kube-api-access-ddgg8\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.305334 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.305776 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.327056 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgg8\" (UniqueName: \"kubernetes.io/projected/f611c52f-90dc-454e-8c3c-ca9d6a915f58-kube-api-access-ddgg8\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.485720 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.688326 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"]
Feb 17 14:17:30 crc kubenswrapper[4836]: I0217 14:17:30.792821 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd" event={"ID":"f611c52f-90dc-454e-8c3c-ca9d6a915f58","Type":"ContainerStarted","Data":"169f45bab279dc23066b81c45e03bb80038e08b42dbaf3f661014cf87fbe3efe"}
Feb 17 14:17:31 crc kubenswrapper[4836]: I0217 14:17:31.800243 4836 generic.go:334] "Generic (PLEG): container finished" podID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerID="ce843b6fe3e045a48ce3a1314ded9e63587f96156cbcca0054c5f79af5057933" exitCode=0
Feb 17 14:17:31 crc kubenswrapper[4836]: I0217 14:17:31.800358 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd" event={"ID":"f611c52f-90dc-454e-8c3c-ca9d6a915f58","Type":"ContainerDied","Data":"ce843b6fe3e045a48ce3a1314ded9e63587f96156cbcca0054c5f79af5057933"}
Feb 17 14:17:31 crc kubenswrapper[4836]: I0217 14:17:31.802414 4836 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 14:17:33 crc kubenswrapper[4836]: I0217 14:17:33.817920 4836 generic.go:334] "Generic (PLEG): container finished" podID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerID="874ff80b58db2b7d602ff4671cc1d0299855d3670e45dd9330d8c2a7336c6ed8" exitCode=0
Feb 17 14:17:33 crc kubenswrapper[4836]: I0217 14:17:33.817973 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd" event={"ID":"f611c52f-90dc-454e-8c3c-ca9d6a915f58","Type":"ContainerDied","Data":"874ff80b58db2b7d602ff4671cc1d0299855d3670e45dd9330d8c2a7336c6ed8"}
Feb 17 14:17:34 crc kubenswrapper[4836]: I0217 14:17:34.826859 4836 generic.go:334] "Generic (PLEG): container finished" podID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerID="3564e75ebe2d2e80922083e1796e4178d0f8b5b3b276be4aecae53c87752f7fb" exitCode=0
Feb 17 14:17:34 crc kubenswrapper[4836]: I0217 14:17:34.826922 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd" event={"ID":"f611c52f-90dc-454e-8c3c-ca9d6a915f58","Type":"ContainerDied","Data":"3564e75ebe2d2e80922083e1796e4178d0f8b5b3b276be4aecae53c87752f7fb"}
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.121074 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.188462 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddgg8\" (UniqueName: \"kubernetes.io/projected/f611c52f-90dc-454e-8c3c-ca9d6a915f58-kube-api-access-ddgg8\") pod \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") "
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.188510 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-bundle\") pod \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") "
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.188529 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-util\") pod \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\" (UID: \"f611c52f-90dc-454e-8c3c-ca9d6a915f58\") "
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.190587 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-bundle" (OuterVolumeSpecName: "bundle") pod "f611c52f-90dc-454e-8c3c-ca9d6a915f58" (UID: "f611c52f-90dc-454e-8c3c-ca9d6a915f58"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.193637 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f611c52f-90dc-454e-8c3c-ca9d6a915f58-kube-api-access-ddgg8" (OuterVolumeSpecName: "kube-api-access-ddgg8") pod "f611c52f-90dc-454e-8c3c-ca9d6a915f58" (UID: "f611c52f-90dc-454e-8c3c-ca9d6a915f58"). InnerVolumeSpecName "kube-api-access-ddgg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.208550 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-util" (OuterVolumeSpecName: "util") pod "f611c52f-90dc-454e-8c3c-ca9d6a915f58" (UID: "f611c52f-90dc-454e-8c3c-ca9d6a915f58"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.289790 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddgg8\" (UniqueName: \"kubernetes.io/projected/f611c52f-90dc-454e-8c3c-ca9d6a915f58-kube-api-access-ddgg8\") on node \"crc\" DevicePath \"\""
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.289837 4836 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.289849 4836 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f611c52f-90dc-454e-8c3c-ca9d6a915f58-util\") on node \"crc\" DevicePath \"\""
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.838947 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd" event={"ID":"f611c52f-90dc-454e-8c3c-ca9d6a915f58","Type":"ContainerDied","Data":"169f45bab279dc23066b81c45e03bb80038e08b42dbaf3f661014cf87fbe3efe"}
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.838996 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169f45bab279dc23066b81c45e03bb80038e08b42dbaf3f661014cf87fbe3efe"
Feb 17 14:17:36 crc kubenswrapper[4836]: I0217 14:17:36.839031 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd"
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.453476 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gfznp"]
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.454652 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-controller" containerID="cri-o://c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.455318 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="sbdb" containerID="cri-o://0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.455394 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="nbdb" containerID="cri-o://f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.455464 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="northd" containerID="cri-o://47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.455528 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.455584 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-node" containerID="cri-o://ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.455631 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-acl-logging" containerID="cri-o://1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.494692 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" containerID="cri-o://61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" gracePeriod=30
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.791452 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/3.log"
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.793067 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovn-acl-logging/0.log"
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.793682 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovn-controller/0.log"
Feb 17 14:17:40 crc kubenswrapper[4836]: I0217
14:17:40.794415 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.851817 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-log-socket\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.851865 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-netns\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.851907 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-ovn\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.851926 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-systemd\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.851949 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-node-log\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.851977 4836 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-netd\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852001 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-slash\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852019 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852053 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-config\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852092 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-bin\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852109 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-openvswitch\") pod 
\"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852134 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zdwb\" (UniqueName: \"kubernetes.io/projected/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-kube-api-access-7zdwb\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852155 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-script-lib\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852176 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-systemd-units\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852198 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-etc-openvswitch\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852214 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-var-lib-openvswitch\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852236 4836 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovn-node-metrics-cert\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852274 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-ovn-kubernetes\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852321 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-kubelet\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852344 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-env-overrides\") pod \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\" (UID: \"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e\") " Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.852967 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853014 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-log-socket" (OuterVolumeSpecName: "log-socket") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853034 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853051 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853499 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-node-log" (OuterVolumeSpecName: "node-log") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853570 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853593 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-slash" (OuterVolumeSpecName: "host-slash") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853613 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853767 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853829 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853876 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.853983 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.854021 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.854041 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.857596 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.857675 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.857811 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.859732 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-kube-api-access-7zdwb" (OuterVolumeSpecName: "kube-api-access-7zdwb") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "kube-api-access-7zdwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.860938 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.865603 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nb8gc"] Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.865868 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerName="util" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.865882 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerName="util" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.865950 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.865962 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.865975 4836 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerName="extract" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.865984 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerName="extract" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.865995 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerName="pull" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866002 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerName="pull" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866013 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866020 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866028 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866035 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866044 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="nbdb" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866051 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="nbdb" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866064 4836 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-acl-logging" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866071 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-acl-logging" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866081 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-node" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866089 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-node" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866101 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866108 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866121 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="northd" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866128 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="northd" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866137 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866145 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866153 4836 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="sbdb" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866160 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="sbdb" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866195 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kubecfg-setup" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866202 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kubecfg-setup" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.866215 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866260 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866735 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-node" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.866757 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="sbdb" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867123 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867141 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867148 4836 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867155 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867162 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-acl-logging" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867171 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="northd" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867179 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f611c52f-90dc-454e-8c3c-ca9d6a915f58" containerName="extract" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867187 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="nbdb" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867195 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovn-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867204 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.867528 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867543 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.867830 4836 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerName="ovnkube-controller" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.868060 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovnkube-controller/3.log" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.870461 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovn-acl-logging/0.log" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871131 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gfznp_67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/ovn-controller/0.log" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871789 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" exitCode=0 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871820 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" exitCode=0 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871828 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" exitCode=0 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871835 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" exitCode=0 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871843 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" 
containerID="bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" exitCode=0 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871849 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" exitCode=0 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871855 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" exitCode=143 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.871863 4836 generic.go:334] "Generic (PLEG): container finished" podID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" containerID="c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" exitCode=143 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.872032 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.872223 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" (UID: "67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.874931 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.875730 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876504 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876559 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876574 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876585 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876596 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876606 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876628 4836 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876635 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876640 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876645 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876650 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876655 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876660 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876666 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876673 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876681 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876687 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876692 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876696 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876702 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876707 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876712 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} Feb 17 
14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876717 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876722 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876727 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876734 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876744 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876751 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876756 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876761 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876766 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876771 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876776 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876781 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876786 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876792 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876855 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gfznp" event={"ID":"67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e","Type":"ContainerDied","Data":"3bdc7f19fb50c4c29fda01e2e231206d0048a98eab720a4ee93274d360c514d1"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876865 4836 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876871 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876876 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876881 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876886 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876891 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876896 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876902 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876907 4836 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876912 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876929 4836 scope.go:117] "RemoveContainer" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.876977 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/2.log" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.877585 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/1.log" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.877619 4836 generic.go:334] "Generic (PLEG): container finished" podID="592aa549-1b1b-441e-93e4-0821e05ff2b2" containerID="d7051348fa11415bbd3ca42ccce04342cfc29fef1e5015e7fedf40514e49824c" exitCode=2 Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.877650 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerDied","Data":"d7051348fa11415bbd3ca42ccce04342cfc29fef1e5015e7fedf40514e49824c"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.877675 4836 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41"} Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.878134 4836 scope.go:117] "RemoveContainer" 
containerID="d7051348fa11415bbd3ca42ccce04342cfc29fef1e5015e7fedf40514e49824c" Feb 17 14:17:40 crc kubenswrapper[4836]: E0217 14:17:40.878368 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-c76cc_openshift-multus(592aa549-1b1b-441e-93e4-0821e05ff2b2)\"" pod="openshift-multus/multus-c76cc" podUID="592aa549-1b1b-441e-93e4-0821e05ff2b2" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.898646 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954206 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-run-netns\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954269 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-etc-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954322 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-ovnkube-config\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954358 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-slash\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954380 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-node-log\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954408 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd545c20-7aca-4536-84b1-826c46c009f0-ovn-node-metrics-cert\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954511 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954604 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-cni-netd\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954689 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-cni-bin\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954745 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-kubelet\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954906 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n99sx\" (UniqueName: \"kubernetes.io/projected/dd545c20-7aca-4536-84b1-826c46c009f0-kube-api-access-n99sx\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954956 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-systemd\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.954989 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-log-socket\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 
14:17:40.955010 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-systemd-units\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955043 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955087 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-var-lib-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955119 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-ovn\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955166 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-env-overrides\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955186 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955510 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-ovnkube-script-lib\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955709 4836 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955729 4836 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955740 4836 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955751 4836 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-log-socket\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: 
I0217 14:17:40.955760 4836 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955770 4836 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955779 4836 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955786 4836 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-node-log\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955795 4836 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955804 4836 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-slash\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955815 4836 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955828 4836 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955837 4836 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955845 4836 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955854 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zdwb\" (UniqueName: \"kubernetes.io/projected/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-kube-api-access-7zdwb\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955863 4836 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955871 4836 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955882 4836 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955890 4836 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-etc-openvswitch\") on node 
\"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.955899 4836 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.963902 4836 scope.go:117] "RemoveContainer" containerID="0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" Feb 17 14:17:40 crc kubenswrapper[4836]: I0217 14:17:40.987634 4836 scope.go:117] "RemoveContainer" containerID="f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.002037 4836 scope.go:117] "RemoveContainer" containerID="47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.014267 4836 scope.go:117] "RemoveContainer" containerID="bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.025731 4836 scope.go:117] "RemoveContainer" containerID="ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.036483 4836 scope.go:117] "RemoveContainer" containerID="1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.053513 4836 scope.go:117] "RemoveContainer" containerID="c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057090 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-cni-netd\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 
14:17:41.057131 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-kubelet\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057154 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-cni-bin\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057192 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n99sx\" (UniqueName: \"kubernetes.io/projected/dd545c20-7aca-4536-84b1-826c46c009f0-kube-api-access-n99sx\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057219 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-systemd\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057241 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-log-socket\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057254 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-cni-bin\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057263 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-systemd-units\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057313 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-systemd-units\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057317 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057314 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-kubelet\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057342 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057356 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-var-lib-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057389 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-ovn\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057389 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-systemd\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057422 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-env-overrides\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057435 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-var-lib-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057429 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-log-socket\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057472 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057463 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-ovn\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057447 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057588 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-ovnkube-script-lib\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057653 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-run-netns\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057696 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-etc-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057728 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-ovnkube-config\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057781 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-slash\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057818 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-node-log\") pod 
\"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057861 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd545c20-7aca-4536-84b1-826c46c009f0-ovn-node-metrics-cert\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057904 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.057995 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-env-overrides\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058050 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-run-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058109 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-slash\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058237 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-node-log\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058358 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-ovnkube-script-lib\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058372 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-run-netns\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058372 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-etc-openvswitch\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058430 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dd545c20-7aca-4536-84b1-826c46c009f0-host-cni-netd\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.058615 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dd545c20-7aca-4536-84b1-826c46c009f0-ovnkube-config\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.061762 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dd545c20-7aca-4536-84b1-826c46c009f0-ovn-node-metrics-cert\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.070117 4836 scope.go:117] "RemoveContainer" containerID="81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.077351 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n99sx\" (UniqueName: \"kubernetes.io/projected/dd545c20-7aca-4536-84b1-826c46c009f0-kube-api-access-n99sx\") pod \"ovnkube-node-nb8gc\" (UID: \"dd545c20-7aca-4536-84b1-826c46c009f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.083013 4836 scope.go:117] "RemoveContainer" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.083621 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": container with ID starting with 61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775 not found: ID does not exist" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.083692 4836 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} err="failed to get container status \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": rpc error: code = NotFound desc = could not find container \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": container with ID starting with 61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.083723 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.084068 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": container with ID starting with 38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83 not found: ID does not exist" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.084134 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} err="failed to get container status \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": rpc error: code = NotFound desc = could not find container \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": container with ID starting with 38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.084164 4836 scope.go:117] "RemoveContainer" containerID="0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 
14:17:41.084519 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": container with ID starting with 0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e not found: ID does not exist" containerID="0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.084552 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} err="failed to get container status \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": rpc error: code = NotFound desc = could not find container \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": container with ID starting with 0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.084572 4836 scope.go:117] "RemoveContainer" containerID="f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.084797 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": container with ID starting with f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad not found: ID does not exist" containerID="f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.084900 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} err="failed to get container status \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": rpc 
error: code = NotFound desc = could not find container \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": container with ID starting with f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.084938 4836 scope.go:117] "RemoveContainer" containerID="47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.085189 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": container with ID starting with 47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a not found: ID does not exist" containerID="47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.085236 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} err="failed to get container status \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": rpc error: code = NotFound desc = could not find container \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": container with ID starting with 47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.085254 4836 scope.go:117] "RemoveContainer" containerID="bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.094940 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": container with ID starting with 
bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee not found: ID does not exist" containerID="bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.095103 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} err="failed to get container status \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": rpc error: code = NotFound desc = could not find container \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": container with ID starting with bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.095133 4836 scope.go:117] "RemoveContainer" containerID="ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.095709 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": container with ID starting with ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b not found: ID does not exist" containerID="ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.095741 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} err="failed to get container status \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": rpc error: code = NotFound desc = could not find container \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": container with ID starting with ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b not found: ID does not 
exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.095762 4836 scope.go:117] "RemoveContainer" containerID="1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.096410 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": container with ID starting with 1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc not found: ID does not exist" containerID="1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.096456 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} err="failed to get container status \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": rpc error: code = NotFound desc = could not find container \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": container with ID starting with 1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.096475 4836 scope.go:117] "RemoveContainer" containerID="c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.097551 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": container with ID starting with c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2 not found: ID does not exist" containerID="c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.097602 4836 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} err="failed to get container status \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": rpc error: code = NotFound desc = could not find container \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": container with ID starting with c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.097638 4836 scope.go:117] "RemoveContainer" containerID="81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9" Feb 17 14:17:41 crc kubenswrapper[4836]: E0217 14:17:41.098049 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": container with ID starting with 81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9 not found: ID does not exist" containerID="81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.098068 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} err="failed to get container status \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": rpc error: code = NotFound desc = could not find container \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": container with ID starting with 81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.098081 4836 scope.go:117] "RemoveContainer" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.098786 4836 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} err="failed to get container status \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": rpc error: code = NotFound desc = could not find container \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": container with ID starting with 61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.098820 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.099352 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} err="failed to get container status \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": rpc error: code = NotFound desc = could not find container \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": container with ID starting with 38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.099376 4836 scope.go:117] "RemoveContainer" containerID="0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.099836 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} err="failed to get container status \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": rpc error: code = NotFound desc = could not find container \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": container with ID starting with 
0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.099859 4836 scope.go:117] "RemoveContainer" containerID="f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.100246 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} err="failed to get container status \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": rpc error: code = NotFound desc = could not find container \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": container with ID starting with f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.100268 4836 scope.go:117] "RemoveContainer" containerID="47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.100673 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} err="failed to get container status \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": rpc error: code = NotFound desc = could not find container \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": container with ID starting with 47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.100691 4836 scope.go:117] "RemoveContainer" containerID="bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.101075 4836 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} err="failed to get container status \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": rpc error: code = NotFound desc = could not find container \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": container with ID starting with bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.101092 4836 scope.go:117] "RemoveContainer" containerID="ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.101564 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} err="failed to get container status \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": rpc error: code = NotFound desc = could not find container \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": container with ID starting with ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.101659 4836 scope.go:117] "RemoveContainer" containerID="1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.102088 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} err="failed to get container status \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": rpc error: code = NotFound desc = could not find container \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": container with ID starting with 1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc not found: ID does not 
exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.102106 4836 scope.go:117] "RemoveContainer" containerID="c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.102448 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} err="failed to get container status \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": rpc error: code = NotFound desc = could not find container \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": container with ID starting with c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.102478 4836 scope.go:117] "RemoveContainer" containerID="81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.102798 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} err="failed to get container status \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": rpc error: code = NotFound desc = could not find container \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": container with ID starting with 81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.102825 4836 scope.go:117] "RemoveContainer" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.103097 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} err="failed to get container status 
\"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": rpc error: code = NotFound desc = could not find container \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": container with ID starting with 61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.103119 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.103440 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} err="failed to get container status \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": rpc error: code = NotFound desc = could not find container \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": container with ID starting with 38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.103477 4836 scope.go:117] "RemoveContainer" containerID="0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.103783 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} err="failed to get container status \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": rpc error: code = NotFound desc = could not find container \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": container with ID starting with 0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.103807 4836 scope.go:117] "RemoveContainer" 
containerID="f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.104168 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} err="failed to get container status \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": rpc error: code = NotFound desc = could not find container \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": container with ID starting with f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.104189 4836 scope.go:117] "RemoveContainer" containerID="47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.104495 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} err="failed to get container status \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": rpc error: code = NotFound desc = could not find container \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": container with ID starting with 47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.104518 4836 scope.go:117] "RemoveContainer" containerID="bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.104807 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} err="failed to get container status \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": rpc error: code = NotFound desc = could 
not find container \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": container with ID starting with bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.104828 4836 scope.go:117] "RemoveContainer" containerID="ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.105085 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} err="failed to get container status \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": rpc error: code = NotFound desc = could not find container \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": container with ID starting with ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.105106 4836 scope.go:117] "RemoveContainer" containerID="1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.105359 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} err="failed to get container status \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": rpc error: code = NotFound desc = could not find container \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": container with ID starting with 1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.105396 4836 scope.go:117] "RemoveContainer" containerID="c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 
14:17:41.105733 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} err="failed to get container status \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": rpc error: code = NotFound desc = could not find container \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": container with ID starting with c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.105756 4836 scope.go:117] "RemoveContainer" containerID="81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.106024 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} err="failed to get container status \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": rpc error: code = NotFound desc = could not find container \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": container with ID starting with 81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.106049 4836 scope.go:117] "RemoveContainer" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.106489 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} err="failed to get container status \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": rpc error: code = NotFound desc = could not find container \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": container with ID starting with 
61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.106523 4836 scope.go:117] "RemoveContainer" containerID="38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.106825 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83"} err="failed to get container status \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": rpc error: code = NotFound desc = could not find container \"38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83\": container with ID starting with 38fa1bf6b59359eb32319ce0184baa7dac44d10d2a0d9502169ea98eb3408c83 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.106846 4836 scope.go:117] "RemoveContainer" containerID="0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.107199 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e"} err="failed to get container status \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": rpc error: code = NotFound desc = could not find container \"0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e\": container with ID starting with 0c8d2edfc6a88391d034f310661e09eb58503f33f015b5e067654db06e8b152e not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.107222 4836 scope.go:117] "RemoveContainer" containerID="f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.107586 4836 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad"} err="failed to get container status \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": rpc error: code = NotFound desc = could not find container \"f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad\": container with ID starting with f5576148df454de0fb3d15ea9de93736aab907ca8cf09669c3c25fde79ec6fad not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.107619 4836 scope.go:117] "RemoveContainer" containerID="47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.107930 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a"} err="failed to get container status \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": rpc error: code = NotFound desc = could not find container \"47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a\": container with ID starting with 47fdc413501c45a2d6e531a1427fd6df3e9d2c57e3d60098b31fbfd695d98a0a not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.107949 4836 scope.go:117] "RemoveContainer" containerID="bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.108382 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee"} err="failed to get container status \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": rpc error: code = NotFound desc = could not find container \"bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee\": container with ID starting with bb05318b17722c3f731dc976d38c05a4ab7eafd686ad0fcf17bd01032409eaee not found: ID does not 
exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.108412 4836 scope.go:117] "RemoveContainer" containerID="ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.108656 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b"} err="failed to get container status \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": rpc error: code = NotFound desc = could not find container \"ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b\": container with ID starting with ebae2d4e3375fda616b0ea16c0c86a5c1c36004e7d22873a289ff37874bbc74b not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.108673 4836 scope.go:117] "RemoveContainer" containerID="1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.108953 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc"} err="failed to get container status \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": rpc error: code = NotFound desc = could not find container \"1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc\": container with ID starting with 1d36d269996169583e4e181aac060491640e74df94385276752f460c408273cc not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.108976 4836 scope.go:117] "RemoveContainer" containerID="c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.109244 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2"} err="failed to get container status 
\"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": rpc error: code = NotFound desc = could not find container \"c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2\": container with ID starting with c6545ec908eecb35a4405050dd77343f0acd4ec7d54ec55853a926e00b7f37f2 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.109281 4836 scope.go:117] "RemoveContainer" containerID="81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.109585 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9"} err="failed to get container status \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": rpc error: code = NotFound desc = could not find container \"81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9\": container with ID starting with 81e6d07edc7d7857cb288980bc7e9d5002234c2fe642df6798611b150b6d66e9 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.109615 4836 scope.go:117] "RemoveContainer" containerID="61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.109967 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775"} err="failed to get container status \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": rpc error: code = NotFound desc = could not find container \"61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775\": container with ID starting with 61bfc9a285863adea7e23ce49aaa4e592d2fd16c4e2e57632d659c3574f51775 not found: ID does not exist" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.199885 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.409852 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gfznp"] Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.428225 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gfznp"] Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.888197 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerDied","Data":"58e884a408a1fef2d13b71fe9ee8454ee2da8617e8e4073b51bdec20c186e183"} Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.888176 4836 generic.go:334] "Generic (PLEG): container finished" podID="dd545c20-7aca-4536-84b1-826c46c009f0" containerID="58e884a408a1fef2d13b71fe9ee8454ee2da8617e8e4073b51bdec20c186e183" exitCode=0 Feb 17 14:17:41 crc kubenswrapper[4836]: I0217 14:17:41.889571 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"942b9b8eb130dc6de6a87582e54cb8ca5624dfd821110527917b3ae4a552a902"} Feb 17 14:17:42 crc kubenswrapper[4836]: I0217 14:17:42.580213 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e" path="/var/lib/kubelet/pods/67e8cda7-ec53-43bd-9fec-8ac4d6ecc26e/volumes" Feb 17 14:17:42 crc kubenswrapper[4836]: I0217 14:17:42.898741 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"25737070acaff4cc5b998dfce6cc5ffcc9c81fcbac86a43c4d8b7f733a985974"} Feb 17 14:17:42 crc kubenswrapper[4836]: I0217 14:17:42.899120 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"d80ba1c6a69894dc15c15503c473f395763b874520ec225059efa316c66fc809"} Feb 17 14:17:42 crc kubenswrapper[4836]: I0217 14:17:42.899135 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"3e077857f671b4f7efe7c63fe55ad8a73a20c1a61c8713072c3124c3ed7a902a"} Feb 17 14:17:42 crc kubenswrapper[4836]: I0217 14:17:42.899148 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"69657103e4ae85731199e40014c18cac79931a53680f577dc479eed9c19ee7ba"} Feb 17 14:17:42 crc kubenswrapper[4836]: I0217 14:17:42.899159 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"c016150b9b2c5c308f421584fa3afce1a66602ef23b87219a13f0ab0cb7f4f15"} Feb 17 14:17:43 crc kubenswrapper[4836]: I0217 14:17:43.910671 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"71bbaacd7163148d079b9df98584e0d2a9d4d47c73e2db23db80124b13e0a1ce"} Feb 17 14:17:45 crc kubenswrapper[4836]: I0217 14:17:45.945981 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"34cb30fd8fa30cf5a52c92154e6dac1df01f64848e689157958282dc444bae67"} Feb 17 14:17:49 crc kubenswrapper[4836]: I0217 14:17:49.395757 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" 
event={"ID":"dd545c20-7aca-4536-84b1-826c46c009f0","Type":"ContainerStarted","Data":"40a86bd99cf3095d192bcd44811f4400a5de72117342bad74fe92645111c5f1e"} Feb 17 14:17:49 crc kubenswrapper[4836]: I0217 14:17:49.397730 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:49 crc kubenswrapper[4836]: I0217 14:17:49.397784 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:49 crc kubenswrapper[4836]: I0217 14:17:49.397797 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:49 crc kubenswrapper[4836]: I0217 14:17:49.492711 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:49 crc kubenswrapper[4836]: I0217 14:17:49.521966 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" podStartSLOduration=9.521918497 podStartE2EDuration="9.521918497s" podCreationTimestamp="2026-02-17 14:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:17:49.51589311 +0000 UTC m=+695.858821399" watchObservedRunningTime="2026-02-17 14:17:49.521918497 +0000 UTC m=+695.864846776" Feb 17 14:17:49 crc kubenswrapper[4836]: I0217 14:17:49.684126 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.215041 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"] Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.215877 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.218193 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-2qj7z" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.218223 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.219830 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.347081 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"] Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.347779 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.350742 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.350916 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-q77cg" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.366250 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"] Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.366928 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" Feb 17 14:17:51 crc kubenswrapper[4836]: I0217 14:17:51.384492 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf6j9\" (UniqueName: \"kubernetes.io/projected/755bc851-3fff-45db-bbcf-164a27afcf85-kube-api-access-pf6j9\") pod \"obo-prometheus-operator-68bc856cb9-xm2rk\" (UID: \"755bc851-3fff-45db-bbcf-164a27afcf85\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.102408 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce0a3fd2-d84a-417c-bd46-c0dba979376e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr\" (UID: \"ce0a3fd2-d84a-417c-bd46-c0dba979376e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.102505 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a9fdae1-f115-4e94-9b72-026862e02026-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm\" (UID: \"5a9fdae1-f115-4e94-9b72-026862e02026\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.131414 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf6j9\" (UniqueName: \"kubernetes.io/projected/755bc851-3fff-45db-bbcf-164a27afcf85-kube-api-access-pf6j9\") pod \"obo-prometheus-operator-68bc856cb9-xm2rk\" (UID: \"755bc851-3fff-45db-bbcf-164a27afcf85\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 
14:17:52.131478 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce0a3fd2-d84a-417c-bd46-c0dba979376e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr\" (UID: \"ce0a3fd2-d84a-417c-bd46-c0dba979376e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.131537 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a9fdae1-f115-4e94-9b72-026862e02026-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm\" (UID: \"5a9fdae1-f115-4e94-9b72-026862e02026\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.139196 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-f94f2"] Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.140204 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.148229 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-5k2fx" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.148736 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.164966 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf6j9\" (UniqueName: \"kubernetes.io/projected/755bc851-3fff-45db-bbcf-164a27afcf85-kube-api-access-pf6j9\") pod \"obo-prometheus-operator-68bc856cb9-xm2rk\" (UID: \"755bc851-3fff-45db-bbcf-164a27afcf85\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.232877 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce0a3fd2-d84a-417c-bd46-c0dba979376e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr\" (UID: \"ce0a3fd2-d84a-417c-bd46-c0dba979376e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.232938 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a9fdae1-f115-4e94-9b72-026862e02026-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm\" (UID: \"5a9fdae1-f115-4e94-9b72-026862e02026\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.232962 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ce0a3fd2-d84a-417c-bd46-c0dba979376e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr\" (UID: \"ce0a3fd2-d84a-417c-bd46-c0dba979376e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.232986 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a9fdae1-f115-4e94-9b72-026862e02026-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm\" (UID: \"5a9fdae1-f115-4e94-9b72-026862e02026\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.236846 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a9fdae1-f115-4e94-9b72-026862e02026-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm\" (UID: \"5a9fdae1-f115-4e94-9b72-026862e02026\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.239325 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a9fdae1-f115-4e94-9b72-026862e02026-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm\" (UID: \"5a9fdae1-f115-4e94-9b72-026862e02026\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.239462 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ce0a3fd2-d84a-417c-bd46-c0dba979376e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr\" (UID: \"ce0a3fd2-d84a-417c-bd46-c0dba979376e\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.256313 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ce0a3fd2-d84a-417c-bd46-c0dba979376e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr\" (UID: \"ce0a3fd2-d84a-417c-bd46-c0dba979376e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.263715 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.281676 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.314584 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(a102d7c03f3108e36b7bfb56594ed6512f487c682ec449ba153bc3d9cab5724a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.314748 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(a102d7c03f3108e36b7bfb56594ed6512f487c682ec449ba153bc3d9cab5724a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.314787 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(a102d7c03f3108e36b7bfb56594ed6512f487c682ec449ba153bc3d9cab5724a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.314847 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators(5a9fdae1-f115-4e94-9b72-026862e02026)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators(5a9fdae1-f115-4e94-9b72-026862e02026)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(a102d7c03f3108e36b7bfb56594ed6512f487c682ec449ba153bc3d9cab5724a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" podUID="5a9fdae1-f115-4e94-9b72-026862e02026"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.323054 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-vqhkf"]
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.323802 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.326074 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-2z5nq"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.328099 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(89d09f6731e23af0db5a3e6650e8f1bff1c0eeaf9b9d3946e1523c677187560c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.328141 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(89d09f6731e23af0db5a3e6650e8f1bff1c0eeaf9b9d3946e1523c677187560c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.328162 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(89d09f6731e23af0db5a3e6650e8f1bff1c0eeaf9b9d3946e1523c677187560c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.328208 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators(ce0a3fd2-d84a-417c-bd46-c0dba979376e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators(ce0a3fd2-d84a-417c-bd46-c0dba979376e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(89d09f6731e23af0db5a3e6650e8f1bff1c0eeaf9b9d3946e1523c677187560c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" podUID="ce0a3fd2-d84a-417c-bd46-c0dba979376e"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.333946 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578-observability-operator-tls\") pod \"observability-operator-59bdc8b94-f94f2\" (UID: \"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578\") " pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.334040 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwp7\" (UniqueName: \"kubernetes.io/projected/d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578-kube-api-access-pbwp7\") pod \"observability-operator-59bdc8b94-f94f2\" (UID: \"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578\") " pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.435064 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578-observability-operator-tls\") pod \"observability-operator-59bdc8b94-f94f2\" (UID: \"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578\") " pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.435129 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c4b6d996-7a86-4512-825f-6e6d34148862-openshift-service-ca\") pod \"perses-operator-5bf474d74f-vqhkf\" (UID: \"c4b6d996-7a86-4512-825f-6e6d34148862\") " pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.435161 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwp7\" (UniqueName: \"kubernetes.io/projected/d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578-kube-api-access-pbwp7\") pod \"observability-operator-59bdc8b94-f94f2\" (UID: \"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578\") " pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.435209 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwc4k\" (UniqueName: \"kubernetes.io/projected/c4b6d996-7a86-4512-825f-6e6d34148862-kube-api-access-lwc4k\") pod \"perses-operator-5bf474d74f-vqhkf\" (UID: \"c4b6d996-7a86-4512-825f-6e6d34148862\") " pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.439638 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.440444 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578-observability-operator-tls\") pod \"observability-operator-59bdc8b94-f94f2\" (UID: \"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578\") " pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.456375 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwp7\" (UniqueName: \"kubernetes.io/projected/d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578-kube-api-access-pbwp7\") pod \"observability-operator-59bdc8b94-f94f2\" (UID: \"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578\") " pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.474037 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(a73fa1b3f7aea1c4edcde216ce381c1d7b965091a0920e08a20d46cf7f096c67): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.474107 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(a73fa1b3f7aea1c4edcde216ce381c1d7b965091a0920e08a20d46cf7f096c67): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.474129 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(a73fa1b3f7aea1c4edcde216ce381c1d7b965091a0920e08a20d46cf7f096c67): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.474171 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators(755bc851-3fff-45db-bbcf-164a27afcf85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators(755bc851-3fff-45db-bbcf-164a27afcf85)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(a73fa1b3f7aea1c4edcde216ce381c1d7b965091a0920e08a20d46cf7f096c67): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" podUID="755bc851-3fff-45db-bbcf-164a27afcf85"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.499381 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.529438 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7374031d3926788c7257128305991081ecceefe7149cfab81e10d7ed4f67c598): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.529501 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7374031d3926788c7257128305991081ecceefe7149cfab81e10d7ed4f67c598): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.529527 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7374031d3926788c7257128305991081ecceefe7149cfab81e10d7ed4f67c598): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.529593 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-f94f2_openshift-operators(d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-f94f2_openshift-operators(d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7374031d3926788c7257128305991081ecceefe7149cfab81e10d7ed4f67c598): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" podUID="d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.537751 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c4b6d996-7a86-4512-825f-6e6d34148862-openshift-service-ca\") pod \"perses-operator-5bf474d74f-vqhkf\" (UID: \"c4b6d996-7a86-4512-825f-6e6d34148862\") " pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.537795 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwc4k\" (UniqueName: \"kubernetes.io/projected/c4b6d996-7a86-4512-825f-6e6d34148862-kube-api-access-lwc4k\") pod \"perses-operator-5bf474d74f-vqhkf\" (UID: \"c4b6d996-7a86-4512-825f-6e6d34148862\") " pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.539062 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c4b6d996-7a86-4512-825f-6e6d34148862-openshift-service-ca\") pod \"perses-operator-5bf474d74f-vqhkf\" (UID: \"c4b6d996-7a86-4512-825f-6e6d34148862\") " pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.582120 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwc4k\" (UniqueName: \"kubernetes.io/projected/c4b6d996-7a86-4512-825f-6e6d34148862-kube-api-access-lwc4k\") pod \"perses-operator-5bf474d74f-vqhkf\" (UID: \"c4b6d996-7a86-4512-825f-6e6d34148862\") " pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.626109 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"]
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.638495 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"]
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.642474 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-vqhkf"]
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.644998 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-f94f2"]
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.652332 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: I0217 14:17:52.676264 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"]
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.691768 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(c8518216692ffff85a5810035447a57077c2cfa6128632f416a37aafc5adc68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.691880 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(c8518216692ffff85a5810035447a57077c2cfa6128632f416a37aafc5adc68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.691916 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(c8518216692ffff85a5810035447a57077c2cfa6128632f416a37aafc5adc68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:52 crc kubenswrapper[4836]: E0217 14:17:52.691983 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-vqhkf_openshift-operators(c4b6d996-7a86-4512-825f-6e6d34148862)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-vqhkf_openshift-operators(c4b6d996-7a86-4512-825f-6e6d34148862)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(c8518216692ffff85a5810035447a57077c2cfa6128632f416a37aafc5adc68e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" podUID="c4b6d996-7a86-4512-825f-6e6d34148862"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214014 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214054 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214024 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214253 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214465 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214721 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214737 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214890 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.214945 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.215489 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.316518 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(c7b2d209d7ed1f8aefeae299e81c8622fc3e16d43f2d3f86380300c5ce4f151d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.316590 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(c7b2d209d7ed1f8aefeae299e81c8622fc3e16d43f2d3f86380300c5ce4f151d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.316613 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(c7b2d209d7ed1f8aefeae299e81c8622fc3e16d43f2d3f86380300c5ce4f151d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.316664 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators(ce0a3fd2-d84a-417c-bd46-c0dba979376e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators(ce0a3fd2-d84a-417c-bd46-c0dba979376e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(c7b2d209d7ed1f8aefeae299e81c8622fc3e16d43f2d3f86380300c5ce4f151d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" podUID="ce0a3fd2-d84a-417c-bd46-c0dba979376e"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.329969 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(a8b8939fd6aa2167a9f7f62aa79f4d937e00d5357472be28cc1a68701f421b45): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.330053 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(a8b8939fd6aa2167a9f7f62aa79f4d937e00d5357472be28cc1a68701f421b45): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.330079 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(a8b8939fd6aa2167a9f7f62aa79f4d937e00d5357472be28cc1a68701f421b45): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.330125 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-vqhkf_openshift-operators(c4b6d996-7a86-4512-825f-6e6d34148862)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-vqhkf_openshift-operators(c4b6d996-7a86-4512-825f-6e6d34148862)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(a8b8939fd6aa2167a9f7f62aa79f4d937e00d5357472be28cc1a68701f421b45): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" podUID="c4b6d996-7a86-4512-825f-6e6d34148862"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.348524 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(2ffa99ee1d8856d93373ba1de1f5431c31896fd44a02167ac392bee540256a8b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.348607 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(2ffa99ee1d8856d93373ba1de1f5431c31896fd44a02167ac392bee540256a8b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.348647 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(2ffa99ee1d8856d93373ba1de1f5431c31896fd44a02167ac392bee540256a8b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.348695 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators(5a9fdae1-f115-4e94-9b72-026862e02026)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators(5a9fdae1-f115-4e94-9b72-026862e02026)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(2ffa99ee1d8856d93373ba1de1f5431c31896fd44a02167ac392bee540256a8b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" podUID="5a9fdae1-f115-4e94-9b72-026862e02026"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.356239 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(6717d87401b60614c8c16cf675baf03575f29616b818320e25760ec5e7aa98e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.356313 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(6717d87401b60614c8c16cf675baf03575f29616b818320e25760ec5e7aa98e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.356333 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(6717d87401b60614c8c16cf675baf03575f29616b818320e25760ec5e7aa98e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.356440 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators(755bc851-3fff-45db-bbcf-164a27afcf85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators(755bc851-3fff-45db-bbcf-164a27afcf85)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(6717d87401b60614c8c16cf675baf03575f29616b818320e25760ec5e7aa98e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" podUID="755bc851-3fff-45db-bbcf-164a27afcf85"
Feb 17 14:17:53 crc kubenswrapper[4836]: I0217 14:17:53.567661 4836 scope.go:117] "RemoveContainer" containerID="d7051348fa11415bbd3ca42ccce04342cfc29fef1e5015e7fedf40514e49824c"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.568327 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-c76cc_openshift-multus(592aa549-1b1b-441e-93e4-0821e05ff2b2)\"" pod="openshift-multus/multus-c76cc" podUID="592aa549-1b1b-441e-93e4-0821e05ff2b2"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.720028 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(76117cbc5c70c88088cbf577789548acc2269123c8e5cfa8d7c9d9a9894446db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.720093 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(76117cbc5c70c88088cbf577789548acc2269123c8e5cfa8d7c9d9a9894446db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.720114 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(76117cbc5c70c88088cbf577789548acc2269123c8e5cfa8d7c9d9a9894446db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-f94f2"
Feb 17 14:17:53 crc kubenswrapper[4836]: E0217 14:17:53.720157 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-f94f2_openshift-operators(d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-f94f2_openshift-operators(d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(76117cbc5c70c88088cbf577789548acc2269123c8e5cfa8d7c9d9a9894446db): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" podUID="d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578"
Feb 17 14:18:06 crc kubenswrapper[4836]: I0217 14:18:06.567596 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:18:06 crc kubenswrapper[4836]: I0217 14:18:06.568828 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:18:06 crc kubenswrapper[4836]: E0217 14:18:06.600375 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(cdb695baf8e9d67708026ebd9d849b6f54a9010f36c3c63ddc9fde83ccd92990): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:18:06 crc kubenswrapper[4836]: E0217 14:18:06.600633 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(cdb695baf8e9d67708026ebd9d849b6f54a9010f36c3c63ddc9fde83ccd92990): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:18:06 crc kubenswrapper[4836]: E0217 14:18:06.600655 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(cdb695baf8e9d67708026ebd9d849b6f54a9010f36c3c63ddc9fde83ccd92990): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf"
Feb 17 14:18:06 crc kubenswrapper[4836]: E0217 14:18:06.600701 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-vqhkf_openshift-operators(c4b6d996-7a86-4512-825f-6e6d34148862)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-vqhkf_openshift-operators(c4b6d996-7a86-4512-825f-6e6d34148862)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-vqhkf_openshift-operators_c4b6d996-7a86-4512-825f-6e6d34148862_0(cdb695baf8e9d67708026ebd9d849b6f54a9010f36c3c63ddc9fde83ccd92990): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" podUID="c4b6d996-7a86-4512-825f-6e6d34148862"
Feb 17 14:18:07 crc kubenswrapper[4836]: I0217 14:18:07.567650 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:18:07 crc kubenswrapper[4836]: I0217 14:18:07.568327 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:18:07 crc kubenswrapper[4836]: I0217 14:18:07.568949 4836 scope.go:117] "RemoveContainer" containerID="d7051348fa11415bbd3ca42ccce04342cfc29fef1e5015e7fedf40514e49824c"
Feb 17 14:18:07 crc kubenswrapper[4836]: E0217 14:18:07.683628 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(7616aaca5ba6e128f1edcce6dab39e57b5d0e5b83eb05aa68fcb8f0aa79fae2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 17 14:18:07 crc kubenswrapper[4836]: E0217 14:18:07.683736 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(7616aaca5ba6e128f1edcce6dab39e57b5d0e5b83eb05aa68fcb8f0aa79fae2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:18:07 crc kubenswrapper[4836]: E0217 14:18:07.683765 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(7616aaca5ba6e128f1edcce6dab39e57b5d0e5b83eb05aa68fcb8f0aa79fae2f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"
Feb 17 14:18:07 crc kubenswrapper[4836]: E0217 14:18:07.683833 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators(ce0a3fd2-d84a-417c-bd46-c0dba979376e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators(ce0a3fd2-d84a-417c-bd46-c0dba979376e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_openshift-operators_ce0a3fd2-d84a-417c-bd46-c0dba979376e_0(7616aaca5ba6e128f1edcce6dab39e57b5d0e5b83eb05aa68fcb8f0aa79fae2f): no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" podUID="ce0a3fd2-d84a-417c-bd46-c0dba979376e" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.365732 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/2.log" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.368802 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/1.log" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.368856 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c76cc" event={"ID":"592aa549-1b1b-441e-93e4-0821e05ff2b2","Type":"ContainerStarted","Data":"8c74a5866188271c5111852363699b9b2a7c209b6cfe49d8ec0dd64613ff8db7"} Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.571810 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.572322 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.572583 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.572795 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.572991 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:18:08 crc kubenswrapper[4836]: I0217 14:18:08.573200 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.624712 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(bbc9c8e79e3ea2f00f0d21fb0dc91089d8e1f283b9575b8718b508c7fa08ab63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.624820 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(bbc9c8e79e3ea2f00f0d21fb0dc91089d8e1f283b9575b8718b508c7fa08ab63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.624840 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(bbc9c8e79e3ea2f00f0d21fb0dc91089d8e1f283b9575b8718b508c7fa08ab63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.624903 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators(755bc851-3fff-45db-bbcf-164a27afcf85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators(755bc851-3fff-45db-bbcf-164a27afcf85)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-xm2rk_openshift-operators_755bc851-3fff-45db-bbcf-164a27afcf85_0(bbc9c8e79e3ea2f00f0d21fb0dc91089d8e1f283b9575b8718b508c7fa08ab63): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" podUID="755bc851-3fff-45db-bbcf-164a27afcf85" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.629089 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7b158f726609e5a26d4ec63e2ad43d6a3b9f29149d2ddf5dbb69b98f81a90f71): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.629171 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7b158f726609e5a26d4ec63e2ad43d6a3b9f29149d2ddf5dbb69b98f81a90f71): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.629201 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7b158f726609e5a26d4ec63e2ad43d6a3b9f29149d2ddf5dbb69b98f81a90f71): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.629258 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-f94f2_openshift-operators(d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-f94f2_openshift-operators(d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-f94f2_openshift-operators_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578_0(7b158f726609e5a26d4ec63e2ad43d6a3b9f29149d2ddf5dbb69b98f81a90f71): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" podUID="d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.633936 4836 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(3a523f82bd10b53ba0c703d2e41044c6ff1f70dabeb8d55c7e592033b07e380b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.633987 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(3a523f82bd10b53ba0c703d2e41044c6ff1f70dabeb8d55c7e592033b07e380b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.634007 4836 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(3a523f82bd10b53ba0c703d2e41044c6ff1f70dabeb8d55c7e592033b07e380b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:18:08 crc kubenswrapper[4836]: E0217 14:18:08.634048 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators(5a9fdae1-f115-4e94-9b72-026862e02026)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators(5a9fdae1-f115-4e94-9b72-026862e02026)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_openshift-operators_5a9fdae1-f115-4e94-9b72-026862e02026_0(3a523f82bd10b53ba0c703d2e41044c6ff1f70dabeb8d55c7e592033b07e380b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" podUID="5a9fdae1-f115-4e94-9b72-026862e02026" Feb 17 14:18:11 crc kubenswrapper[4836]: I0217 14:18:11.273875 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nb8gc" Feb 17 14:18:15 crc kubenswrapper[4836]: I0217 14:18:15.008796 4836 scope.go:117] "RemoveContainer" containerID="b64d012815880d0ba6314438a96b0ffd6bdad4678135a9bd3c9c2a4d6eb83c41" Feb 17 14:18:15 crc kubenswrapper[4836]: I0217 14:18:15.413176 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c76cc_592aa549-1b1b-441e-93e4-0821e05ff2b2/kube-multus/2.log" Feb 17 14:18:20 crc kubenswrapper[4836]: I0217 14:18:20.567979 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" Feb 17 14:18:20 crc kubenswrapper[4836]: I0217 14:18:20.568224 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" Feb 17 14:18:20 crc kubenswrapper[4836]: I0217 14:18:20.569366 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" Feb 17 14:18:20 crc kubenswrapper[4836]: I0217 14:18:20.569834 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" Feb 17 14:18:20 crc kubenswrapper[4836]: I0217 14:18:20.885784 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr"] Feb 17 14:18:20 crc kubenswrapper[4836]: W0217 14:18:20.894478 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0a3fd2_d84a_417c_bd46_c0dba979376e.slice/crio-49fefc9e7d18ab8caf445571ee5cc595e92005bf37309aed0e22e7f547957a34 WatchSource:0}: Error finding container 49fefc9e7d18ab8caf445571ee5cc595e92005bf37309aed0e22e7f547957a34: Status 404 returned error can't find the container with id 49fefc9e7d18ab8caf445571ee5cc595e92005bf37309aed0e22e7f547957a34 Feb 17 14:18:20 crc kubenswrapper[4836]: I0217 14:18:20.912110 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-vqhkf"] Feb 17 14:18:20 crc kubenswrapper[4836]: W0217 14:18:20.917646 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4b6d996_7a86_4512_825f_6e6d34148862.slice/crio-dd7def9354ed9beb2f0764687f14d31da29f379e91018e84c3758233b0d4fbd6 WatchSource:0}: Error finding container dd7def9354ed9beb2f0764687f14d31da29f379e91018e84c3758233b0d4fbd6: Status 404 returned error can't find the container with id dd7def9354ed9beb2f0764687f14d31da29f379e91018e84c3758233b0d4fbd6 Feb 17 14:18:21 crc kubenswrapper[4836]: I0217 14:18:21.450218 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" event={"ID":"c4b6d996-7a86-4512-825f-6e6d34148862","Type":"ContainerStarted","Data":"dd7def9354ed9beb2f0764687f14d31da29f379e91018e84c3758233b0d4fbd6"} Feb 17 14:18:21 crc kubenswrapper[4836]: I0217 14:18:21.451525 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" event={"ID":"ce0a3fd2-d84a-417c-bd46-c0dba979376e","Type":"ContainerStarted","Data":"49fefc9e7d18ab8caf445571ee5cc595e92005bf37309aed0e22e7f547957a34"} Feb 17 14:18:21 crc kubenswrapper[4836]: I0217 14:18:21.567435 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:18:21 crc kubenswrapper[4836]: I0217 14:18:21.568263 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" Feb 17 14:18:21 crc kubenswrapper[4836]: I0217 14:18:21.815958 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk"] Feb 17 14:18:21 crc kubenswrapper[4836]: W0217 14:18:21.827578 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755bc851_3fff_45db_bbcf_164a27afcf85.slice/crio-06c320e967b08b80978d88908ead978940ea5cebdc861719314ea2cc3c71cb82 WatchSource:0}: Error finding container 06c320e967b08b80978d88908ead978940ea5cebdc861719314ea2cc3c71cb82: Status 404 returned error can't find the container with id 06c320e967b08b80978d88908ead978940ea5cebdc861719314ea2cc3c71cb82 Feb 17 14:18:22 crc kubenswrapper[4836]: I0217 14:18:22.460454 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" event={"ID":"755bc851-3fff-45db-bbcf-164a27afcf85","Type":"ContainerStarted","Data":"06c320e967b08b80978d88908ead978940ea5cebdc861719314ea2cc3c71cb82"} Feb 17 14:18:22 crc kubenswrapper[4836]: I0217 14:18:22.567466 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:18:22 crc kubenswrapper[4836]: I0217 14:18:22.567530 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:22 crc kubenswrapper[4836]: I0217 14:18:22.568041 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" Feb 17 14:18:22 crc kubenswrapper[4836]: I0217 14:18:22.568864 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:22 crc kubenswrapper[4836]: I0217 14:18:22.959423 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-f94f2"] Feb 17 14:18:22 crc kubenswrapper[4836]: W0217 14:18:22.967471 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6acbcf2_dfc0_4a7b_b6bd_4b62c0b03578.slice/crio-d51bb9e93098b5cd655edc904abc10f7c5c285cca8b23062fb820b727c6d6941 WatchSource:0}: Error finding container d51bb9e93098b5cd655edc904abc10f7c5c285cca8b23062fb820b727c6d6941: Status 404 returned error can't find the container with id d51bb9e93098b5cd655edc904abc10f7c5c285cca8b23062fb820b727c6d6941 Feb 17 14:18:23 crc kubenswrapper[4836]: I0217 14:18:23.014074 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm"] Feb 17 14:18:23 crc kubenswrapper[4836]: W0217 14:18:23.026992 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9fdae1_f115_4e94_9b72_026862e02026.slice/crio-054a3ff7a9e29c02b401290c1f9213e27ab3854468251eb7c846f66b3f7ecd60 WatchSource:0}: Error 
finding container 054a3ff7a9e29c02b401290c1f9213e27ab3854468251eb7c846f66b3f7ecd60: Status 404 returned error can't find the container with id 054a3ff7a9e29c02b401290c1f9213e27ab3854468251eb7c846f66b3f7ecd60 Feb 17 14:18:23 crc kubenswrapper[4836]: I0217 14:18:23.478513 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" event={"ID":"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578","Type":"ContainerStarted","Data":"d51bb9e93098b5cd655edc904abc10f7c5c285cca8b23062fb820b727c6d6941"} Feb 17 14:18:23 crc kubenswrapper[4836]: I0217 14:18:23.483705 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" event={"ID":"5a9fdae1-f115-4e94-9b72-026862e02026","Type":"ContainerStarted","Data":"054a3ff7a9e29c02b401290c1f9213e27ab3854468251eb7c846f66b3f7ecd60"} Feb 17 14:18:29 crc kubenswrapper[4836]: I0217 14:18:29.943318 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" event={"ID":"755bc851-3fff-45db-bbcf-164a27afcf85","Type":"ContainerStarted","Data":"f91bde5f52256cdc7cbbe106e02425ea11ef3ae0650501533003424513157952"} Feb 17 14:18:29 crc kubenswrapper[4836]: I0217 14:18:29.945042 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" event={"ID":"5a9fdae1-f115-4e94-9b72-026862e02026","Type":"ContainerStarted","Data":"4e57391a3d6ce3e1fa73c5de27f3090dd1065a387940c00c99871885ae404b29"} Feb 17 14:18:29 crc kubenswrapper[4836]: I0217 14:18:29.947205 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" event={"ID":"c4b6d996-7a86-4512-825f-6e6d34148862","Type":"ContainerStarted","Data":"c8662ff9c387493dd51bf7acbf6d1af480903838614b3eb0799c84a871a68b87"} Feb 17 14:18:29 crc kubenswrapper[4836]: I0217 14:18:29.947345 4836 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" Feb 17 14:18:29 crc kubenswrapper[4836]: I0217 14:18:29.949783 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" event={"ID":"ce0a3fd2-d84a-417c-bd46-c0dba979376e","Type":"ContainerStarted","Data":"68116ebd01f159fd46c207f99b23289dcfceebb669691ac495307e644c1a63af"} Feb 17 14:18:29 crc kubenswrapper[4836]: I0217 14:18:29.974733 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-xm2rk" podStartSLOduration=31.327418557 podStartE2EDuration="38.974687349s" podCreationTimestamp="2026-02-17 14:17:51 +0000 UTC" firstStartedPulling="2026-02-17 14:18:21.829819144 +0000 UTC m=+728.172747413" lastFinishedPulling="2026-02-17 14:18:29.477087936 +0000 UTC m=+735.820016205" observedRunningTime="2026-02-17 14:18:29.968343624 +0000 UTC m=+736.311271913" watchObservedRunningTime="2026-02-17 14:18:29.974687349 +0000 UTC m=+736.317615639" Feb 17 14:18:30 crc kubenswrapper[4836]: I0217 14:18:30.311403 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" podStartSLOduration=29.782553461 podStartE2EDuration="38.311378454s" podCreationTimestamp="2026-02-17 14:17:52 +0000 UTC" firstStartedPulling="2026-02-17 14:18:20.920411109 +0000 UTC m=+727.263339378" lastFinishedPulling="2026-02-17 14:18:29.449236102 +0000 UTC m=+735.792164371" observedRunningTime="2026-02-17 14:18:30.304956475 +0000 UTC m=+736.647884754" watchObservedRunningTime="2026-02-17 14:18:30.311378454 +0000 UTC m=+736.654306743" Feb 17 14:18:30 crc kubenswrapper[4836]: I0217 14:18:30.352476 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr" 
podStartSLOduration=30.774288741 podStartE2EDuration="39.352444004s" podCreationTimestamp="2026-02-17 14:17:51 +0000 UTC" firstStartedPulling="2026-02-17 14:18:20.897977156 +0000 UTC m=+727.240905425" lastFinishedPulling="2026-02-17 14:18:29.476132419 +0000 UTC m=+735.819060688" observedRunningTime="2026-02-17 14:18:30.350120769 +0000 UTC m=+736.693049048" watchObservedRunningTime="2026-02-17 14:18:30.352444004 +0000 UTC m=+736.695372263" Feb 17 14:18:35 crc kubenswrapper[4836]: I0217 14:18:35.299224 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" event={"ID":"d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578","Type":"ContainerStarted","Data":"5824928e16ab2aba8546cfdc352c134bfff6bf88426c41f8c0c8d0e74db6324c"} Feb 17 14:18:35 crc kubenswrapper[4836]: I0217 14:18:35.299923 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:35 crc kubenswrapper[4836]: I0217 14:18:35.329227 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" podStartSLOduration=32.775299342 podStartE2EDuration="44.329199985s" podCreationTimestamp="2026-02-17 14:17:51 +0000 UTC" firstStartedPulling="2026-02-17 14:18:22.972191151 +0000 UTC m=+729.315119410" lastFinishedPulling="2026-02-17 14:18:34.526091784 +0000 UTC m=+740.869020053" observedRunningTime="2026-02-17 14:18:35.322957312 +0000 UTC m=+741.665885601" watchObservedRunningTime="2026-02-17 14:18:35.329199985 +0000 UTC m=+741.672128254" Feb 17 14:18:35 crc kubenswrapper[4836]: I0217 14:18:35.329599 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5b474d8486-lngwm" podStartSLOduration=37.873284332 podStartE2EDuration="44.329594697s" podCreationTimestamp="2026-02-17 14:17:51 +0000 UTC" firstStartedPulling="2026-02-17 
14:18:23.03803346 +0000 UTC m=+729.380961729" lastFinishedPulling="2026-02-17 14:18:29.494343805 +0000 UTC m=+735.837272094" observedRunningTime="2026-02-17 14:18:30.381009397 +0000 UTC m=+736.723937666" watchObservedRunningTime="2026-02-17 14:18:35.329594697 +0000 UTC m=+741.672522966" Feb 17 14:18:35 crc kubenswrapper[4836]: I0217 14:18:35.359374 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-f94f2" Feb 17 14:18:42 crc kubenswrapper[4836]: I0217 14:18:42.656574 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-vqhkf" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.890666 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dmddv"] Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.891945 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.894020 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.894770 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.896773 4836 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-kw992" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.906571 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-vtfx4"] Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.906883 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcgbc\" (UniqueName: 
\"kubernetes.io/projected/918985c6-76a8-4bb2-8868-278b633133a9-kube-api-access-gcgbc\") pod \"cert-manager-cainjector-cf98fcc89-dmddv\" (UID: \"918985c6-76a8-4bb2-8868-278b633133a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.907872 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-vtfx4" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.911924 4836 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-nq86z" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.914553 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zhbzj"] Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.915604 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.928955 4836 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vbn52" Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.933652 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-vtfx4"] Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.939858 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zhbzj"] Feb 17 14:18:44 crc kubenswrapper[4836]: I0217 14:18:44.957038 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dmddv"] Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.008167 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcgbc\" (UniqueName: \"kubernetes.io/projected/918985c6-76a8-4bb2-8868-278b633133a9-kube-api-access-gcgbc\") pod 
\"cert-manager-cainjector-cf98fcc89-dmddv\" (UID: \"918985c6-76a8-4bb2-8868-278b633133a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.008541 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84qg8\" (UniqueName: \"kubernetes.io/projected/662067b4-39c2-4ab7-adb4-ba8a6330b0b9-kube-api-access-84qg8\") pod \"cert-manager-webhook-687f57d79b-zhbzj\" (UID: \"662067b4-39c2-4ab7-adb4-ba8a6330b0b9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.008748 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc878\" (UniqueName: \"kubernetes.io/projected/63f75031-4e24-42f7-80cc-2f3fb289dac0-kube-api-access-cc878\") pod \"cert-manager-858654f9db-vtfx4\" (UID: \"63f75031-4e24-42f7-80cc-2f3fb289dac0\") " pod="cert-manager/cert-manager-858654f9db-vtfx4" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.035767 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcgbc\" (UniqueName: \"kubernetes.io/projected/918985c6-76a8-4bb2-8868-278b633133a9-kube-api-access-gcgbc\") pod \"cert-manager-cainjector-cf98fcc89-dmddv\" (UID: \"918985c6-76a8-4bb2-8868-278b633133a9\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.110735 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc878\" (UniqueName: \"kubernetes.io/projected/63f75031-4e24-42f7-80cc-2f3fb289dac0-kube-api-access-cc878\") pod \"cert-manager-858654f9db-vtfx4\" (UID: \"63f75031-4e24-42f7-80cc-2f3fb289dac0\") " pod="cert-manager/cert-manager-858654f9db-vtfx4" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.110842 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-84qg8\" (UniqueName: \"kubernetes.io/projected/662067b4-39c2-4ab7-adb4-ba8a6330b0b9-kube-api-access-84qg8\") pod \"cert-manager-webhook-687f57d79b-zhbzj\" (UID: \"662067b4-39c2-4ab7-adb4-ba8a6330b0b9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.137435 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84qg8\" (UniqueName: \"kubernetes.io/projected/662067b4-39c2-4ab7-adb4-ba8a6330b0b9-kube-api-access-84qg8\") pod \"cert-manager-webhook-687f57d79b-zhbzj\" (UID: \"662067b4-39c2-4ab7-adb4-ba8a6330b0b9\") " pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.137457 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc878\" (UniqueName: \"kubernetes.io/projected/63f75031-4e24-42f7-80cc-2f3fb289dac0-kube-api-access-cc878\") pod \"cert-manager-858654f9db-vtfx4\" (UID: \"63f75031-4e24-42f7-80cc-2f3fb289dac0\") " pod="cert-manager/cert-manager-858654f9db-vtfx4" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.216223 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.240226 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-vtfx4" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.251531 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.820175 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-dmddv"] Feb 17 14:18:45 crc kubenswrapper[4836]: I0217 14:18:45.958449 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-zhbzj"] Feb 17 14:18:46 crc kubenswrapper[4836]: I0217 14:18:46.136042 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-vtfx4"] Feb 17 14:18:46 crc kubenswrapper[4836]: W0217 14:18:46.140133 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63f75031_4e24_42f7_80cc_2f3fb289dac0.slice/crio-13fa2482bef76bef78e56e4d59c9fa10d65b9ec893b2abd432a09a9a5a29d1a7 WatchSource:0}: Error finding container 13fa2482bef76bef78e56e4d59c9fa10d65b9ec893b2abd432a09a9a5a29d1a7: Status 404 returned error can't find the container with id 13fa2482bef76bef78e56e4d59c9fa10d65b9ec893b2abd432a09a9a5a29d1a7 Feb 17 14:18:46 crc kubenswrapper[4836]: I0217 14:18:46.454527 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" event={"ID":"918985c6-76a8-4bb2-8868-278b633133a9","Type":"ContainerStarted","Data":"e77d45274e336b86af9f0403d5e2f4a1a86bd5f0902a50a6e16bbe6a9120ceb4"} Feb 17 14:18:46 crc kubenswrapper[4836]: I0217 14:18:46.455660 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" event={"ID":"662067b4-39c2-4ab7-adb4-ba8a6330b0b9","Type":"ContainerStarted","Data":"2961909ead6c585030cebc332cd7163a3d59b4a09191eb7a33f381ea3f8e925d"} Feb 17 14:18:46 crc kubenswrapper[4836]: I0217 14:18:46.456747 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-vtfx4" 
event={"ID":"63f75031-4e24-42f7-80cc-2f3fb289dac0","Type":"ContainerStarted","Data":"13fa2482bef76bef78e56e4d59c9fa10d65b9ec893b2abd432a09a9a5a29d1a7"} Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.281962 4836 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.853529 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" event={"ID":"662067b4-39c2-4ab7-adb4-ba8a6330b0b9","Type":"ContainerStarted","Data":"3e08ab673431ee790211e9556c838c1bf6f6fb7544b0d27a27043570e3c38044"} Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.853726 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.857136 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-vtfx4" event={"ID":"63f75031-4e24-42f7-80cc-2f3fb289dac0","Type":"ContainerStarted","Data":"17d85d2266dcb6ef95d13d8fc7572ab81e7ebc3e1ddeff21fbe69d2fcb5b306b"} Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.860043 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" event={"ID":"918985c6-76a8-4bb2-8868-278b633133a9","Type":"ContainerStarted","Data":"5b7f25a98a96ce86bd9c258bb498ee41b2e57b44349490d627627adcca770733"} Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.874679 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" podStartSLOduration=2.761455248 podStartE2EDuration="10.874659006s" podCreationTimestamp="2026-02-17 14:18:44 +0000 UTC" firstStartedPulling="2026-02-17 14:18:45.971834621 +0000 UTC m=+752.314762890" lastFinishedPulling="2026-02-17 14:18:54.085038379 +0000 UTC m=+760.427966648" 
observedRunningTime="2026-02-17 14:18:54.871691073 +0000 UTC m=+761.214619342" watchObservedRunningTime="2026-02-17 14:18:54.874659006 +0000 UTC m=+761.217587265" Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.898829 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-vtfx4" podStartSLOduration=2.894071183 podStartE2EDuration="10.898790466s" podCreationTimestamp="2026-02-17 14:18:44 +0000 UTC" firstStartedPulling="2026-02-17 14:18:46.143321436 +0000 UTC m=+752.486249705" lastFinishedPulling="2026-02-17 14:18:54.148040719 +0000 UTC m=+760.490968988" observedRunningTime="2026-02-17 14:18:54.88922695 +0000 UTC m=+761.232155219" watchObservedRunningTime="2026-02-17 14:18:54.898790466 +0000 UTC m=+761.241718745" Feb 17 14:18:54 crc kubenswrapper[4836]: I0217 14:18:54.920743 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-dmddv" podStartSLOduration=2.67655523 podStartE2EDuration="10.920724665s" podCreationTimestamp="2026-02-17 14:18:44 +0000 UTC" firstStartedPulling="2026-02-17 14:18:45.841040198 +0000 UTC m=+752.183968467" lastFinishedPulling="2026-02-17 14:18:54.085209633 +0000 UTC m=+760.428137902" observedRunningTime="2026-02-17 14:18:54.918800491 +0000 UTC m=+761.261728760" watchObservedRunningTime="2026-02-17 14:18:54.920724665 +0000 UTC m=+761.263652944" Feb 17 14:19:00 crc kubenswrapper[4836]: I0217 14:19:00.257944 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-zhbzj" Feb 17 14:19:29 crc kubenswrapper[4836]: I0217 14:19:29.765027 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:19:29 crc 
kubenswrapper[4836]: I0217 14:19:29.765810 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.658159 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf"] Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.659639 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.661735 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.680508 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf"] Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.839233 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/3464477d-9902-4d40-9048-443132123fb3-kube-api-access-fvsk4\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.839372 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-bundle\") pod 
\"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.839432 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.941605 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/3464477d-9902-4d40-9048-443132123fb3-kube-api-access-fvsk4\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.942077 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.942255 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " 
pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.943031 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.943156 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:31 crc kubenswrapper[4836]: I0217 14:19:31.978128 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/3464477d-9902-4d40-9048-443132123fb3-kube-api-access-fvsk4\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:32 crc kubenswrapper[4836]: I0217 14:19:32.016716 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:32 crc kubenswrapper[4836]: I0217 14:19:32.544219 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf"] Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.133359 4836 generic.go:334] "Generic (PLEG): container finished" podID="3464477d-9902-4d40-9048-443132123fb3" containerID="65aa24d05f36a4132e36a602723d81f5d15fc397e4fc83e26cf4a6780a6cbce0" exitCode=0 Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.133415 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" event={"ID":"3464477d-9902-4d40-9048-443132123fb3","Type":"ContainerDied","Data":"65aa24d05f36a4132e36a602723d81f5d15fc397e4fc83e26cf4a6780a6cbce0"} Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.133673 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" event={"ID":"3464477d-9902-4d40-9048-443132123fb3","Type":"ContainerStarted","Data":"ad0f9defb731bc03c633adaa62a2320c1a2b5f7f64dd33ce92cf51e6d63cea84"} Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.900753 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.901695 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.903986 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.905199 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.906213 4836 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-4mnhk" Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.914516 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.942617 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lzp6h"] Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.945869 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:33 crc kubenswrapper[4836]: I0217 14:19:33.955772 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzp6h"] Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.062824 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\") pod \"minio\" (UID: \"673f5440-6cd4-4341-8388-fdf924e48044\") " pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.062889 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-catalog-content\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " 
pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.062926 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-utilities\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.062955 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdlcw\" (UniqueName: \"kubernetes.io/projected/cfed558c-2562-4771-af8e-bc422f87be49-kube-api-access-tdlcw\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.062985 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tst2s\" (UniqueName: \"kubernetes.io/projected/673f5440-6cd4-4341-8388-fdf924e48044-kube-api-access-tst2s\") pod \"minio\" (UID: \"673f5440-6cd4-4341-8388-fdf924e48044\") " pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.163960 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-utilities\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.164049 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdlcw\" (UniqueName: \"kubernetes.io/projected/cfed558c-2562-4771-af8e-bc422f87be49-kube-api-access-tdlcw\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " 
pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.164091 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tst2s\" (UniqueName: \"kubernetes.io/projected/673f5440-6cd4-4341-8388-fdf924e48044-kube-api-access-tst2s\") pod \"minio\" (UID: \"673f5440-6cd4-4341-8388-fdf924e48044\") " pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.164156 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\") pod \"minio\" (UID: \"673f5440-6cd4-4341-8388-fdf924e48044\") " pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.164187 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-catalog-content\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.164579 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-catalog-content\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.164629 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-utilities\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 
14:19:34.177028 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.177093 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\") pod \"minio\" (UID: \"673f5440-6cd4-4341-8388-fdf924e48044\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/629f4bca77debccc3b1fdd12a1a4fe57c6a22cc0f05c47e1303e3db1224caa2f/globalmount\"" pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.191631 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdlcw\" (UniqueName: \"kubernetes.io/projected/cfed558c-2562-4771-af8e-bc422f87be49-kube-api-access-tdlcw\") pod \"redhat-operators-lzp6h\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") " pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.196398 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tst2s\" (UniqueName: \"kubernetes.io/projected/673f5440-6cd4-4341-8388-fdf924e48044-kube-api-access-tst2s\") pod \"minio\" (UID: \"673f5440-6cd4-4341-8388-fdf924e48044\") " pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.207434 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2c45594-ebae-414b-b3cb-e8abe867c20a\") pod \"minio\" (UID: \"673f5440-6cd4-4341-8388-fdf924e48044\") " pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.219336 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.265255 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzp6h" Feb 17 14:19:34 crc kubenswrapper[4836]: I0217 14:19:34.633130 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 17 14:19:34 crc kubenswrapper[4836]: W0217 14:19:34.636201 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod673f5440_6cd4_4341_8388_fdf924e48044.slice/crio-35ea741011fbc50b7e7ff703b1d8cfc72dafe3c84b464800e5c17d49f9283249 WatchSource:0}: Error finding container 35ea741011fbc50b7e7ff703b1d8cfc72dafe3c84b464800e5c17d49f9283249: Status 404 returned error can't find the container with id 35ea741011fbc50b7e7ff703b1d8cfc72dafe3c84b464800e5c17d49f9283249 Feb 17 14:19:35 crc kubenswrapper[4836]: I0217 14:19:35.009621 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzp6h"] Feb 17 14:19:35 crc kubenswrapper[4836]: W0217 14:19:35.016942 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfed558c_2562_4771_af8e_bc422f87be49.slice/crio-673a85edd931637c360a40a9b7c8e08f34e58e0982f41b7b113c561515f17ba5 WatchSource:0}: Error finding container 673a85edd931637c360a40a9b7c8e08f34e58e0982f41b7b113c561515f17ba5: Status 404 returned error can't find the container with id 673a85edd931637c360a40a9b7c8e08f34e58e0982f41b7b113c561515f17ba5 Feb 17 14:19:35 crc kubenswrapper[4836]: I0217 14:19:35.151398 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"673f5440-6cd4-4341-8388-fdf924e48044","Type":"ContainerStarted","Data":"35ea741011fbc50b7e7ff703b1d8cfc72dafe3c84b464800e5c17d49f9283249"} Feb 17 14:19:35 crc kubenswrapper[4836]: I0217 14:19:35.156224 4836 
generic.go:334] "Generic (PLEG): container finished" podID="3464477d-9902-4d40-9048-443132123fb3" containerID="844e099e74476fd7a0a52853c101508eb28066193f6a0fd9d6d237ab45adab36" exitCode=0 Feb 17 14:19:35 crc kubenswrapper[4836]: I0217 14:19:35.156338 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" event={"ID":"3464477d-9902-4d40-9048-443132123fb3","Type":"ContainerDied","Data":"844e099e74476fd7a0a52853c101508eb28066193f6a0fd9d6d237ab45adab36"} Feb 17 14:19:35 crc kubenswrapper[4836]: I0217 14:19:35.158849 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzp6h" event={"ID":"cfed558c-2562-4771-af8e-bc422f87be49","Type":"ContainerStarted","Data":"673a85edd931637c360a40a9b7c8e08f34e58e0982f41b7b113c561515f17ba5"} Feb 17 14:19:36 crc kubenswrapper[4836]: I0217 14:19:36.182790 4836 generic.go:334] "Generic (PLEG): container finished" podID="3464477d-9902-4d40-9048-443132123fb3" containerID="66a2284aeac61984203666d085154cc52e97f425c50842712f125a2a2476f42e" exitCode=0 Feb 17 14:19:36 crc kubenswrapper[4836]: I0217 14:19:36.183669 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" event={"ID":"3464477d-9902-4d40-9048-443132123fb3","Type":"ContainerDied","Data":"66a2284aeac61984203666d085154cc52e97f425c50842712f125a2a2476f42e"} Feb 17 14:19:36 crc kubenswrapper[4836]: I0217 14:19:36.187998 4836 generic.go:334] "Generic (PLEG): container finished" podID="cfed558c-2562-4771-af8e-bc422f87be49" containerID="d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce" exitCode=0 Feb 17 14:19:36 crc kubenswrapper[4836]: I0217 14:19:36.188076 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzp6h" 
event={"ID":"cfed558c-2562-4771-af8e-bc422f87be49","Type":"ContainerDied","Data":"d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce"} Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.379313 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.483560 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-bundle\") pod \"3464477d-9902-4d40-9048-443132123fb3\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.483618 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-util\") pod \"3464477d-9902-4d40-9048-443132123fb3\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.483686 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/3464477d-9902-4d40-9048-443132123fb3-kube-api-access-fvsk4\") pod \"3464477d-9902-4d40-9048-443132123fb3\" (UID: \"3464477d-9902-4d40-9048-443132123fb3\") " Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.485872 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-bundle" (OuterVolumeSpecName: "bundle") pod "3464477d-9902-4d40-9048-443132123fb3" (UID: "3464477d-9902-4d40-9048-443132123fb3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.495798 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3464477d-9902-4d40-9048-443132123fb3-kube-api-access-fvsk4" (OuterVolumeSpecName: "kube-api-access-fvsk4") pod "3464477d-9902-4d40-9048-443132123fb3" (UID: "3464477d-9902-4d40-9048-443132123fb3"). InnerVolumeSpecName "kube-api-access-fvsk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.585119 4836 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.585159 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvsk4\" (UniqueName: \"kubernetes.io/projected/3464477d-9902-4d40-9048-443132123fb3-kube-api-access-fvsk4\") on node \"crc\" DevicePath \"\"" Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.595773 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-util" (OuterVolumeSpecName: "util") pod "3464477d-9902-4d40-9048-443132123fb3" (UID: "3464477d-9902-4d40-9048-443132123fb3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:19:38 crc kubenswrapper[4836]: I0217 14:19:38.686565 4836 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3464477d-9902-4d40-9048-443132123fb3-util\") on node \"crc\" DevicePath \"\""
Feb 17 14:19:39 crc kubenswrapper[4836]: I0217 14:19:39.228038 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf" event={"ID":"3464477d-9902-4d40-9048-443132123fb3","Type":"ContainerDied","Data":"ad0f9defb731bc03c633adaa62a2320c1a2b5f7f64dd33ce92cf51e6d63cea84"}
Feb 17 14:19:39 crc kubenswrapper[4836]: I0217 14:19:39.228116 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad0f9defb731bc03c633adaa62a2320c1a2b5f7f64dd33ce92cf51e6d63cea84"
Feb 17 14:19:39 crc kubenswrapper[4836]: I0217 14:19:39.228251 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf"
Feb 17 14:19:42 crc kubenswrapper[4836]: I0217 14:19:42.477742 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzp6h" event={"ID":"cfed558c-2562-4771-af8e-bc422f87be49","Type":"ContainerStarted","Data":"d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238"}
Feb 17 14:19:42 crc kubenswrapper[4836]: I0217 14:19:42.496908 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"673f5440-6cd4-4341-8388-fdf924e48044","Type":"ContainerStarted","Data":"2f9bf953667ce4e254bcd6e56c58f905f8da40fdf5143fc07bd7d53964847e14"}
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.520281 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=10.110343819 podStartE2EDuration="17.520254933s" podCreationTimestamp="2026-02-17 14:19:31 +0000 UTC" firstStartedPulling="2026-02-17 14:19:34.63907564 +0000 UTC m=+800.982003909" lastFinishedPulling="2026-02-17 14:19:42.048986754 +0000 UTC m=+808.391915023" observedRunningTime="2026-02-17 14:19:42.925159465 +0000 UTC m=+809.268087744" watchObservedRunningTime="2026-02-17 14:19:48.520254933 +0000 UTC m=+814.863183202"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.524080 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"]
Feb 17 14:19:48 crc kubenswrapper[4836]: E0217 14:19:48.524439 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3464477d-9902-4d40-9048-443132123fb3" containerName="util"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.524463 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3464477d-9902-4d40-9048-443132123fb3" containerName="util"
Feb 17 14:19:48 crc kubenswrapper[4836]: E0217 14:19:48.524485 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3464477d-9902-4d40-9048-443132123fb3" containerName="pull"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.524498 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3464477d-9902-4d40-9048-443132123fb3" containerName="pull"
Feb 17 14:19:48 crc kubenswrapper[4836]: E0217 14:19:48.524514 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3464477d-9902-4d40-9048-443132123fb3" containerName="extract"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.524528 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3464477d-9902-4d40-9048-443132123fb3" containerName="extract"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.524691 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="3464477d-9902-4d40-9048-443132123fb3" containerName="extract"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.525843 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.528187 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54rn\" (UniqueName: \"kubernetes.io/projected/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-kube-api-access-b54rn\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.528447 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.528544 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-apiservice-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.528666 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-manager-config\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.528724 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-webhook-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.531402 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.532352 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.532403 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.533463 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.534961 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.536996 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-xqgbn"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.630485 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b54rn\" (UniqueName: \"kubernetes.io/projected/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-kube-api-access-b54rn\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.630741 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.630837 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-apiservice-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.630911 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-manager-config\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.630953 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-webhook-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.633002 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-manager-config\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.726995 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.733422 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"]
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.740730 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-webhook-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.744551 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-apiservice-cert\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.782148 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b54rn\" (UniqueName: \"kubernetes.io/projected/297a6b35-d11d-4c2b-858c-79cb4c3c1b2c-kube-api-access-b54rn\") pod \"loki-operator-controller-manager-dfd4b8c4b-kclf7\" (UID: \"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c\") " pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:48 crc kubenswrapper[4836]: I0217 14:19:48.848853 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:19:51 crc kubenswrapper[4836]: I0217 14:19:51.049013 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"]
Feb 17 14:19:51 crc kubenswrapper[4836]: I0217 14:19:51.072033 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7" event={"ID":"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c","Type":"ContainerStarted","Data":"6b58103ebb3f82fe763cafe563d382e1d62fa5efd2795d748f78f78246f2b428"}
Feb 17 14:19:53 crc kubenswrapper[4836]: I0217 14:19:53.283469 4836 generic.go:334] "Generic (PLEG): container finished" podID="cfed558c-2562-4771-af8e-bc422f87be49" containerID="d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238" exitCode=0
Feb 17 14:19:53 crc kubenswrapper[4836]: I0217 14:19:53.283865 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzp6h" event={"ID":"cfed558c-2562-4771-af8e-bc422f87be49","Type":"ContainerDied","Data":"d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238"}
Feb 17 14:19:55 crc kubenswrapper[4836]: I0217 14:19:55.313848 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzp6h" event={"ID":"cfed558c-2562-4771-af8e-bc422f87be49","Type":"ContainerStarted","Data":"a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69"}
Feb 17 14:19:55 crc kubenswrapper[4836]: I0217 14:19:55.346989 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lzp6h" podStartSLOduration=5.886289972 podStartE2EDuration="22.346960575s" podCreationTimestamp="2026-02-17 14:19:33 +0000 UTC" firstStartedPulling="2026-02-17 14:19:37.692717493 +0000 UTC m=+804.035645762" lastFinishedPulling="2026-02-17 14:19:54.153388096 +0000 UTC m=+820.496316365" observedRunningTime="2026-02-17 14:19:55.34136655 +0000 UTC m=+821.684294819" watchObservedRunningTime="2026-02-17 14:19:55.346960575 +0000 UTC m=+821.689888864"
Feb 17 14:19:59 crc kubenswrapper[4836]: I0217 14:19:59.766600 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:19:59 crc kubenswrapper[4836]: I0217 14:19:59.767636 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:20:03 crc kubenswrapper[4836]: I0217 14:20:03.457080 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7" event={"ID":"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c","Type":"ContainerStarted","Data":"ebabf52e45b1432c2f537d383dbd701657289b74835ccf4e71171e80022c892b"}
Feb 17 14:20:04 crc kubenswrapper[4836]: I0217 14:20:04.266276 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lzp6h"
Feb 17 14:20:04 crc kubenswrapper[4836]: I0217 14:20:04.266343 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lzp6h"
Feb 17 14:20:05 crc kubenswrapper[4836]: I0217 14:20:05.322107 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lzp6h" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="registry-server" probeResult="failure" output=<
Feb 17 14:20:05 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s
Feb 17 14:20:05 crc kubenswrapper[4836]: >
Feb 17 14:20:13 crc kubenswrapper[4836]: I0217 14:20:13.651475 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7" event={"ID":"297a6b35-d11d-4c2b-858c-79cb4c3c1b2c","Type":"ContainerStarted","Data":"9f3f299bf99b01623cba9bbf6b343240ae7d86b324fbd64168c45c1bf7eea652"}
Feb 17 14:20:13 crc kubenswrapper[4836]: I0217 14:20:13.652847 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:20:13 crc kubenswrapper[4836]: I0217 14:20:13.655475 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7"
Feb 17 14:20:13 crc kubenswrapper[4836]: I0217 14:20:13.698154 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-dfd4b8c4b-kclf7" podStartSLOduration=3.530038009 podStartE2EDuration="25.698120548s" podCreationTimestamp="2026-02-17 14:19:48 +0000 UTC" firstStartedPulling="2026-02-17 14:19:51.05927638 +0000 UTC m=+817.402204649" lastFinishedPulling="2026-02-17 14:20:13.227358919 +0000 UTC m=+839.570287188" observedRunningTime="2026-02-17 14:20:13.690227699 +0000 UTC m=+840.033155968" watchObservedRunningTime="2026-02-17 14:20:13.698120548 +0000 UTC m=+840.041048817"
Feb 17 14:20:14 crc kubenswrapper[4836]: I0217 14:20:14.324692 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lzp6h"
Feb 17 14:20:14 crc kubenswrapper[4836]: I0217 14:20:14.366039 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lzp6h"
Feb 17 14:20:16 crc kubenswrapper[4836]: I0217 14:20:16.728905 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzp6h"]
Feb 17 14:20:16 crc kubenswrapper[4836]: I0217 14:20:16.729637 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lzp6h" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="registry-server" containerID="cri-o://a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69" gracePeriod=2
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.197719 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzp6h"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.355695 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-catalog-content\") pod \"cfed558c-2562-4771-af8e-bc422f87be49\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") "
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.355887 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdlcw\" (UniqueName: \"kubernetes.io/projected/cfed558c-2562-4771-af8e-bc422f87be49-kube-api-access-tdlcw\") pod \"cfed558c-2562-4771-af8e-bc422f87be49\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") "
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.355978 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-utilities\") pod \"cfed558c-2562-4771-af8e-bc422f87be49\" (UID: \"cfed558c-2562-4771-af8e-bc422f87be49\") "
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.356985 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-utilities" (OuterVolumeSpecName: "utilities") pod "cfed558c-2562-4771-af8e-bc422f87be49" (UID: "cfed558c-2562-4771-af8e-bc422f87be49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.362515 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfed558c-2562-4771-af8e-bc422f87be49-kube-api-access-tdlcw" (OuterVolumeSpecName: "kube-api-access-tdlcw") pod "cfed558c-2562-4771-af8e-bc422f87be49" (UID: "cfed558c-2562-4771-af8e-bc422f87be49"). InnerVolumeSpecName "kube-api-access-tdlcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.457364 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdlcw\" (UniqueName: \"kubernetes.io/projected/cfed558c-2562-4771-af8e-bc422f87be49-kube-api-access-tdlcw\") on node \"crc\" DevicePath \"\""
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.457401 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.486427 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfed558c-2562-4771-af8e-bc422f87be49" (UID: "cfed558c-2562-4771-af8e-bc422f87be49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.561011 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfed558c-2562-4771-af8e-bc422f87be49-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.753053 4836 generic.go:334] "Generic (PLEG): container finished" podID="cfed558c-2562-4771-af8e-bc422f87be49" containerID="a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69" exitCode=0
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.753125 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzp6h" event={"ID":"cfed558c-2562-4771-af8e-bc422f87be49","Type":"ContainerDied","Data":"a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69"}
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.753180 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzp6h" event={"ID":"cfed558c-2562-4771-af8e-bc422f87be49","Type":"ContainerDied","Data":"673a85edd931637c360a40a9b7c8e08f34e58e0982f41b7b113c561515f17ba5"}
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.753179 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzp6h"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.753203 4836 scope.go:117] "RemoveContainer" containerID="a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.783111 4836 scope.go:117] "RemoveContainer" containerID="d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.794373 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzp6h"]
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.798729 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lzp6h"]
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.809415 4836 scope.go:117] "RemoveContainer" containerID="d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.828899 4836 scope.go:117] "RemoveContainer" containerID="a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69"
Feb 17 14:20:17 crc kubenswrapper[4836]: E0217 14:20:17.829640 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69\": container with ID starting with a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69 not found: ID does not exist" containerID="a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.829719 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69"} err="failed to get container status \"a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69\": rpc error: code = NotFound desc = could not find container \"a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69\": container with ID starting with a270a09a390ebcea0a4f509773de7bae62620250eaf8884cedb160dbb4315a69 not found: ID does not exist"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.829762 4836 scope.go:117] "RemoveContainer" containerID="d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238"
Feb 17 14:20:17 crc kubenswrapper[4836]: E0217 14:20:17.830343 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238\": container with ID starting with d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238 not found: ID does not exist" containerID="d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.830449 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238"} err="failed to get container status \"d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238\": rpc error: code = NotFound desc = could not find container \"d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238\": container with ID starting with d2cbaf947c2e327430e0f53838c199bd1b517cad7c1f8cc6fa7d59ca6b872238 not found: ID does not exist"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.830534 4836 scope.go:117] "RemoveContainer" containerID="d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce"
Feb 17 14:20:17 crc kubenswrapper[4836]: E0217 14:20:17.831214 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce\": container with ID starting with d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce not found: ID does not exist" containerID="d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce"
Feb 17 14:20:17 crc kubenswrapper[4836]: I0217 14:20:17.831310 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce"} err="failed to get container status \"d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce\": rpc error: code = NotFound desc = could not find container \"d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce\": container with ID starting with d4dfb4a28991a2ad7ad1d97f7e9b6acb9b352c07cce3ac2497280bb1a45e40ce not found: ID does not exist"
Feb 17 14:20:18 crc kubenswrapper[4836]: I0217 14:20:18.581838 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfed558c-2562-4771-af8e-bc422f87be49" path="/var/lib/kubelet/pods/cfed558c-2562-4771-af8e-bc422f87be49/volumes"
Feb 17 14:20:29 crc kubenswrapper[4836]: I0217 14:20:29.764877 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:20:29 crc kubenswrapper[4836]: I0217 14:20:29.765931 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:20:29 crc kubenswrapper[4836]: I0217 14:20:29.765999 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g"
Feb 17 14:20:29 crc kubenswrapper[4836]: I0217 14:20:29.832393 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d7f43ee4be167fb696d056804834f76d74b6a96b2dd00fc7f1328e7b9c2e7869"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:20:29 crc kubenswrapper[4836]: I0217 14:20:29.832504 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://d7f43ee4be167fb696d056804834f76d74b6a96b2dd00fc7f1328e7b9c2e7869" gracePeriod=600
Feb 17 14:20:30 crc kubenswrapper[4836]: I0217 14:20:30.841899 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="d7f43ee4be167fb696d056804834f76d74b6a96b2dd00fc7f1328e7b9c2e7869" exitCode=0
Feb 17 14:20:30 crc kubenswrapper[4836]: I0217 14:20:30.841979 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"d7f43ee4be167fb696d056804834f76d74b6a96b2dd00fc7f1328e7b9c2e7869"}
Feb 17 14:20:30 crc kubenswrapper[4836]: I0217 14:20:30.842643 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"89b78e4cc2264dc06417ab903dd2a1618c1aee2c1d950babae0b011a2e9eac59"}
Feb 17 14:20:30 crc kubenswrapper[4836]: I0217 14:20:30.842668 4836 scope.go:117] "RemoveContainer" containerID="1b2a0d64ec4a5faa95e6312a8de2b21c8f3e85f4d851c39760904a4b16753249"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.008080 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"]
Feb 17 14:20:40 crc kubenswrapper[4836]: E0217 14:20:40.008889 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="extract-utilities"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.008906 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="extract-utilities"
Feb 17 14:20:40 crc kubenswrapper[4836]: E0217 14:20:40.008924 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="registry-server"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.008931 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="registry-server"
Feb 17 14:20:40 crc kubenswrapper[4836]: E0217 14:20:40.008946 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="extract-content"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.008952 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="extract-content"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.009062 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfed558c-2562-4771-af8e-bc422f87be49" containerName="registry-server"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.023634 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.031912 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.055420 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"]
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.082825 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.082868 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.082890 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckvl4\" (UniqueName: \"kubernetes.io/projected/96be2236-f07d-4944-8afa-b15a4ce0c4f0-kube-api-access-ckvl4\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.183877 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.184271 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.184317 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckvl4\" (UniqueName: \"kubernetes.io/projected/96be2236-f07d-4944-8afa-b15a4ce0c4f0-kube-api-access-ckvl4\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.184517 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.184810 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.223424 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckvl4\" (UniqueName: \"kubernetes.io/projected/96be2236-f07d-4944-8afa-b15a4ce0c4f0-kube-api-access-ckvl4\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.348422 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.611991 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz"]
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.929622 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" event={"ID":"96be2236-f07d-4944-8afa-b15a4ce0c4f0","Type":"ContainerStarted","Data":"8235b28307ff2660a2209b952fd70b03df5c5d3ae9afbc6f8b22818710d07e80"}
Feb 17 14:20:40 crc kubenswrapper[4836]: I0217 14:20:40.929758 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" event={"ID":"96be2236-f07d-4944-8afa-b15a4ce0c4f0","Type":"ContainerStarted","Data":"ab5298b8b361b42235544a670b8b6569354940dd46e82c73afc20041f4e0b413"}
Feb 17 14:20:41 crc kubenswrapper[4836]: I0217 14:20:41.946633 4836 generic.go:334] "Generic (PLEG): container finished" podID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerID="8235b28307ff2660a2209b952fd70b03df5c5d3ae9afbc6f8b22818710d07e80" exitCode=0
Feb 17 14:20:41 crc kubenswrapper[4836]: I0217 14:20:41.946710 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" event={"ID":"96be2236-f07d-4944-8afa-b15a4ce0c4f0","Type":"ContainerDied","Data":"8235b28307ff2660a2209b952fd70b03df5c5d3ae9afbc6f8b22818710d07e80"}
Feb 17 14:20:44 crc kubenswrapper[4836]: I0217 14:20:44.988995 4836 generic.go:334] "Generic (PLEG): container finished" podID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerID="b63be5ecf4e404a8432eedfca2301d5ed2af1b2db76aacecbb2911a40d6711fe" exitCode=0
Feb 17 14:20:44 crc kubenswrapper[4836]: I0217 14:20:44.989058 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" event={"ID":"96be2236-f07d-4944-8afa-b15a4ce0c4f0","Type":"ContainerDied","Data":"b63be5ecf4e404a8432eedfca2301d5ed2af1b2db76aacecbb2911a40d6711fe"}
Feb 17 14:20:45 crc kubenswrapper[4836]: I0217 14:20:45.996413 4836 generic.go:334] "Generic (PLEG): container finished" podID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerID="83ef15b1d271b620ffa7952a5e4be567e8e2a86b23ca0b2ee1ba0731de7e4453" exitCode=0
Feb 17 14:20:45 crc kubenswrapper[4836]: I0217 14:20:45.996559 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" event={"ID":"96be2236-f07d-4944-8afa-b15a4ce0c4f0","Type":"ContainerDied","Data":"83ef15b1d271b620ffa7952a5e4be567e8e2a86b23ca0b2ee1ba0731de7e4453"}
Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.260640 4836 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.422353 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckvl4\" (UniqueName: \"kubernetes.io/projected/96be2236-f07d-4944-8afa-b15a4ce0c4f0-kube-api-access-ckvl4\") pod \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.422607 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-bundle\") pod \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.423348 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-bundle" (OuterVolumeSpecName: "bundle") pod "96be2236-f07d-4944-8afa-b15a4ce0c4f0" (UID: "96be2236-f07d-4944-8afa-b15a4ce0c4f0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.423401 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-util\") pod \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\" (UID: \"96be2236-f07d-4944-8afa-b15a4ce0c4f0\") " Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.423946 4836 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.433203 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96be2236-f07d-4944-8afa-b15a4ce0c4f0-kube-api-access-ckvl4" (OuterVolumeSpecName: "kube-api-access-ckvl4") pod "96be2236-f07d-4944-8afa-b15a4ce0c4f0" (UID: "96be2236-f07d-4944-8afa-b15a4ce0c4f0"). InnerVolumeSpecName "kube-api-access-ckvl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.435981 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-util" (OuterVolumeSpecName: "util") pod "96be2236-f07d-4944-8afa-b15a4ce0c4f0" (UID: "96be2236-f07d-4944-8afa-b15a4ce0c4f0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.524984 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckvl4\" (UniqueName: \"kubernetes.io/projected/96be2236-f07d-4944-8afa-b15a4ce0c4f0-kube-api-access-ckvl4\") on node \"crc\" DevicePath \"\"" Feb 17 14:20:47 crc kubenswrapper[4836]: I0217 14:20:47.525036 4836 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/96be2236-f07d-4944-8afa-b15a4ce0c4f0-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:20:48 crc kubenswrapper[4836]: I0217 14:20:48.014164 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" event={"ID":"96be2236-f07d-4944-8afa-b15a4ce0c4f0","Type":"ContainerDied","Data":"ab5298b8b361b42235544a670b8b6569354940dd46e82c73afc20041f4e0b413"} Feb 17 14:20:48 crc kubenswrapper[4836]: I0217 14:20:48.014255 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz" Feb 17 14:20:48 crc kubenswrapper[4836]: I0217 14:20:48.014233 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab5298b8b361b42235544a670b8b6569354940dd46e82c73afc20041f4e0b413" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.665419 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-9w75g"] Feb 17 14:20:51 crc kubenswrapper[4836]: E0217 14:20:51.666095 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerName="extract" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.666111 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerName="extract" Feb 17 14:20:51 crc kubenswrapper[4836]: E0217 14:20:51.666130 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerName="util" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.666139 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerName="util" Feb 17 14:20:51 crc kubenswrapper[4836]: E0217 14:20:51.666152 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerName="pull" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.666161 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerName="pull" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.666357 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="96be2236-f07d-4944-8afa-b15a4ce0c4f0" containerName="extract" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.666955 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.669530 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.669812 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.672250 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-4qq5m" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.679470 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-9w75g"] Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.695880 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjqkq\" (UniqueName: \"kubernetes.io/projected/c190e38d-4893-49c9-a633-e6b912030d37-kube-api-access-vjqkq\") pod \"nmstate-operator-694c9596b7-9w75g\" (UID: \"c190e38d-4893-49c9-a633-e6b912030d37\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.796880 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjqkq\" (UniqueName: \"kubernetes.io/projected/c190e38d-4893-49c9-a633-e6b912030d37-kube-api-access-vjqkq\") pod \"nmstate-operator-694c9596b7-9w75g\" (UID: \"c190e38d-4893-49c9-a633-e6b912030d37\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.814445 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjqkq\" (UniqueName: \"kubernetes.io/projected/c190e38d-4893-49c9-a633-e6b912030d37-kube-api-access-vjqkq\") pod \"nmstate-operator-694c9596b7-9w75g\" (UID: 
\"c190e38d-4893-49c9-a633-e6b912030d37\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" Feb 17 14:20:51 crc kubenswrapper[4836]: I0217 14:20:51.986103 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" Feb 17 14:20:52 crc kubenswrapper[4836]: I0217 14:20:52.188135 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-9w75g"] Feb 17 14:20:53 crc kubenswrapper[4836]: I0217 14:20:53.047868 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" event={"ID":"c190e38d-4893-49c9-a633-e6b912030d37","Type":"ContainerStarted","Data":"67a57c72c9e94ba262c5b325a5d69a76019472bfac6c3846c7ace76d4e46915a"} Feb 17 14:20:55 crc kubenswrapper[4836]: I0217 14:20:55.062061 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" event={"ID":"c190e38d-4893-49c9-a633-e6b912030d37","Type":"ContainerStarted","Data":"1495efa1ef582fd1a8b215e602903b75c391e0d227af75075e32e473efba5e9b"} Feb 17 14:20:55 crc kubenswrapper[4836]: I0217 14:20:55.081480 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-9w75g" podStartSLOduration=1.804008023 podStartE2EDuration="4.081425664s" podCreationTimestamp="2026-02-17 14:20:51 +0000 UTC" firstStartedPulling="2026-02-17 14:20:52.203783008 +0000 UTC m=+878.546711267" lastFinishedPulling="2026-02-17 14:20:54.481200639 +0000 UTC m=+880.824128908" observedRunningTime="2026-02-17 14:20:55.07803904 +0000 UTC m=+881.420967319" watchObservedRunningTime="2026-02-17 14:20:55.081425664 +0000 UTC m=+881.424353953" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.193181 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-877xf"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 
14:21:01.195357 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.204526 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2n6qm" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.219456 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.220766 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.229740 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-877xf"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.233640 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.283491 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.301635 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-w8wbg"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.302412 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.334682 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-52vj8\" (UID: \"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.334999 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vsn\" (UniqueName: \"kubernetes.io/projected/0d0615b5-ef3b-4932-957c-a4b44f35c1a9-kube-api-access-84vsn\") pod \"nmstate-metrics-58c85c668d-877xf\" (UID: \"0d0615b5-ef3b-4932-957c-a4b44f35c1a9\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.335184 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grx5q\" (UniqueName: \"kubernetes.io/projected/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-kube-api-access-grx5q\") pod \"nmstate-webhook-866bcb46dc-52vj8\" (UID: \"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.438500 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-ovs-socket\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.438568 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz49c\" (UniqueName: 
\"kubernetes.io/projected/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-kube-api-access-gz49c\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.438638 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grx5q\" (UniqueName: \"kubernetes.io/projected/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-kube-api-access-grx5q\") pod \"nmstate-webhook-866bcb46dc-52vj8\" (UID: \"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.438678 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-nmstate-lock\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.438718 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-52vj8\" (UID: \"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.438755 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-dbus-socket\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.438796 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vsn\" 
(UniqueName: \"kubernetes.io/projected/0d0615b5-ef3b-4932-957c-a4b44f35c1a9-kube-api-access-84vsn\") pod \"nmstate-metrics-58c85c668d-877xf\" (UID: \"0d0615b5-ef3b-4932-957c-a4b44f35c1a9\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" Feb 17 14:21:01 crc kubenswrapper[4836]: E0217 14:21:01.439397 4836 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 17 14:21:01 crc kubenswrapper[4836]: E0217 14:21:01.439475 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-tls-key-pair podName:6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8 nodeName:}" failed. No retries permitted until 2026-02-17 14:21:01.93945596 +0000 UTC m=+888.282384229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-tls-key-pair") pod "nmstate-webhook-866bcb46dc-52vj8" (UID: "6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8") : secret "openshift-nmstate-webhook" not found Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.486930 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vsn\" (UniqueName: \"kubernetes.io/projected/0d0615b5-ef3b-4932-957c-a4b44f35c1a9-kube-api-access-84vsn\") pod \"nmstate-metrics-58c85c668d-877xf\" (UID: \"0d0615b5-ef3b-4932-957c-a4b44f35c1a9\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.489493 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grx5q\" (UniqueName: \"kubernetes.io/projected/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-kube-api-access-grx5q\") pod \"nmstate-webhook-866bcb46dc-52vj8\" (UID: \"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.522429 4836 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.537281 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.540091 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz49c\" (UniqueName: \"kubernetes.io/projected/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-kube-api-access-gz49c\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.540280 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-nmstate-lock\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.540433 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-dbus-socket\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.540537 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-ovs-socket\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.540853 4836 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-nmstate"/"nginx-conf" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.541448 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-nmstate-lock\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.541607 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-dbus-socket\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.541625 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-ovs-socket\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.540946 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.540997 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-j25cv" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.552227 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.552828 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.573362 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz49c\" (UniqueName: \"kubernetes.io/projected/9ff842c9-08b8-4363-b82a-5f7e2461ec2a-kube-api-access-gz49c\") pod \"nmstate-handler-w8wbg\" (UID: \"9ff842c9-08b8-4363-b82a-5f7e2461ec2a\") " pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.636701 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.644331 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2kzb\" (UniqueName: \"kubernetes.io/projected/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-kube-api-access-w2kzb\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.644405 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.644422 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: W0217 14:21:01.683050 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff842c9_08b8_4363_b82a_5f7e2461ec2a.slice/crio-e28585bf6f19fb3108b832e84d25055b3a9b9067d38607b8e5943b281573e17e WatchSource:0}: Error finding container e28585bf6f19fb3108b832e84d25055b3a9b9067d38607b8e5943b281573e17e: Status 404 returned error can't find the container with id e28585bf6f19fb3108b832e84d25055b3a9b9067d38607b8e5943b281573e17e Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.745389 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kzb\" (UniqueName: \"kubernetes.io/projected/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-kube-api-access-w2kzb\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.745455 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.745479 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " 
pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.751101 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-b844687d4-4gf5j"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.751964 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.755721 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.786383 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.798895 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2kzb\" (UniqueName: \"kubernetes.io/projected/8fc6d41c-a8a1-4fe3-ade2-b79761920b17-kube-api-access-w2kzb\") pod \"nmstate-console-plugin-5c78fc5d65-q985f\" (UID: \"8fc6d41c-a8a1-4fe3-ade2-b79761920b17\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.823114 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b844687d4-4gf5j"] Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.848363 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-config\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.848443 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-oauth-serving-cert\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.848473 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-oauth-config\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.848507 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-trusted-ca-bundle\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.848535 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8frxf\" (UniqueName: \"kubernetes.io/projected/a994c152-32cc-448d-a7f7-099bd60fb8d9-kube-api-access-8frxf\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.848574 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-serving-cert\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.848606 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-service-ca\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.853882 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.920854 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-877xf"] Feb 17 14:21:01 crc kubenswrapper[4836]: W0217 14:21:01.922639 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d0615b5_ef3b_4932_957c_a4b44f35c1a9.slice/crio-8f217c70f44750112f961da5e87d3955d3c432341620aa0dc51f2e75d18c16e3 WatchSource:0}: Error finding container 8f217c70f44750112f961da5e87d3955d3c432341620aa0dc51f2e75d18c16e3: Status 404 returned error can't find the container with id 8f217c70f44750112f961da5e87d3955d3c432341620aa0dc51f2e75d18c16e3 Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950274 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-trusted-ca-bundle\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " 
pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950366 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8frxf\" (UniqueName: \"kubernetes.io/projected/a994c152-32cc-448d-a7f7-099bd60fb8d9-kube-api-access-8frxf\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950409 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-serving-cert\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950444 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-service-ca\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950484 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-52vj8\" (UID: \"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950546 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-config\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " 
pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950594 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-oauth-serving-cert\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.950618 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-oauth-config\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.951830 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-trusted-ca-bundle\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.951981 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-service-ca\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.952604 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-config\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc 
kubenswrapper[4836]: I0217 14:21:01.953226 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a994c152-32cc-448d-a7f7-099bd60fb8d9-oauth-serving-cert\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.955597 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-oauth-config\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.956460 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a994c152-32cc-448d-a7f7-099bd60fb8d9-console-serving-cert\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.957630 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-52vj8\" (UID: \"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:01 crc kubenswrapper[4836]: I0217 14:21:01.971413 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8frxf\" (UniqueName: \"kubernetes.io/projected/a994c152-32cc-448d-a7f7-099bd60fb8d9-kube-api-access-8frxf\") pod \"console-b844687d4-4gf5j\" (UID: \"a994c152-32cc-448d-a7f7-099bd60fb8d9\") " pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.085679 4836 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f"] Feb 17 14:21:02 crc kubenswrapper[4836]: W0217 14:21:02.089820 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc6d41c_a8a1_4fe3_ade2_b79761920b17.slice/crio-516c8bbfd65d5225a7defe5ace76faaddad1d69a7c69913806b98a6dde228d0e WatchSource:0}: Error finding container 516c8bbfd65d5225a7defe5ace76faaddad1d69a7c69913806b98a6dde228d0e: Status 404 returned error can't find the container with id 516c8bbfd65d5225a7defe5ace76faaddad1d69a7c69913806b98a6dde228d0e Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.106339 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" event={"ID":"0d0615b5-ef3b-4932-957c-a4b44f35c1a9","Type":"ContainerStarted","Data":"8f217c70f44750112f961da5e87d3955d3c432341620aa0dc51f2e75d18c16e3"} Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.107355 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" event={"ID":"8fc6d41c-a8a1-4fe3-ade2-b79761920b17","Type":"ContainerStarted","Data":"516c8bbfd65d5225a7defe5ace76faaddad1d69a7c69913806b98a6dde228d0e"} Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.108051 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-w8wbg" event={"ID":"9ff842c9-08b8-4363-b82a-5f7e2461ec2a","Type":"ContainerStarted","Data":"e28585bf6f19fb3108b832e84d25055b3a9b9067d38607b8e5943b281573e17e"} Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.109577 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.168409 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.329681 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b844687d4-4gf5j"] Feb 17 14:21:02 crc kubenswrapper[4836]: W0217 14:21:02.339607 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda994c152_32cc_448d_a7f7_099bd60fb8d9.slice/crio-4a2749211e3b0287b4eb0c905a1801e479c9610291f3ba45bfe6eeb8d5212844 WatchSource:0}: Error finding container 4a2749211e3b0287b4eb0c905a1801e479c9610291f3ba45bfe6eeb8d5212844: Status 404 returned error can't find the container with id 4a2749211e3b0287b4eb0c905a1801e479c9610291f3ba45bfe6eeb8d5212844 Feb 17 14:21:02 crc kubenswrapper[4836]: I0217 14:21:02.418463 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8"] Feb 17 14:21:02 crc kubenswrapper[4836]: W0217 14:21:02.439698 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6a6ca4_12c5_4bc1_b67e_5a48d1fe86f8.slice/crio-433cba2ed4ba704cd9599b2ca4047f781885b90b3de1ec31939f76d4b7d65f11 WatchSource:0}: Error finding container 433cba2ed4ba704cd9599b2ca4047f781885b90b3de1ec31939f76d4b7d65f11: Status 404 returned error can't find the container with id 433cba2ed4ba704cd9599b2ca4047f781885b90b3de1ec31939f76d4b7d65f11 Feb 17 14:21:03 crc kubenswrapper[4836]: I0217 14:21:03.116823 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" event={"ID":"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8","Type":"ContainerStarted","Data":"433cba2ed4ba704cd9599b2ca4047f781885b90b3de1ec31939f76d4b7d65f11"} Feb 17 14:21:03 crc kubenswrapper[4836]: I0217 14:21:03.118796 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b844687d4-4gf5j" 
event={"ID":"a994c152-32cc-448d-a7f7-099bd60fb8d9","Type":"ContainerStarted","Data":"729b5bfd9fe518d7af30813213189948586fc2a39921928919b8098327fedc0c"} Feb 17 14:21:03 crc kubenswrapper[4836]: I0217 14:21:03.118827 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b844687d4-4gf5j" event={"ID":"a994c152-32cc-448d-a7f7-099bd60fb8d9","Type":"ContainerStarted","Data":"4a2749211e3b0287b4eb0c905a1801e479c9610291f3ba45bfe6eeb8d5212844"} Feb 17 14:21:03 crc kubenswrapper[4836]: I0217 14:21:03.137086 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b844687d4-4gf5j" podStartSLOduration=2.137065423 podStartE2EDuration="2.137065423s" podCreationTimestamp="2026-02-17 14:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:21:03.13583998 +0000 UTC m=+889.478768249" watchObservedRunningTime="2026-02-17 14:21:03.137065423 +0000 UTC m=+889.479993692" Feb 17 14:21:05 crc kubenswrapper[4836]: I0217 14:21:05.155088 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-w8wbg" event={"ID":"9ff842c9-08b8-4363-b82a-5f7e2461ec2a","Type":"ContainerStarted","Data":"e91294bcf50ad5ea50a8a24d08c1f117b383b99f73cfa3dcaaee8cb047cd56b3"} Feb 17 14:21:05 crc kubenswrapper[4836]: I0217 14:21:05.155701 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:05 crc kubenswrapper[4836]: I0217 14:21:05.157250 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" event={"ID":"6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8","Type":"ContainerStarted","Data":"4eacf308cad48168c73e5827af9a5fa4a128251d6957a66cdff72a3a21be9592"} Feb 17 14:21:05 crc kubenswrapper[4836]: I0217 14:21:05.157392 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:05 crc kubenswrapper[4836]: I0217 14:21:05.160307 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" event={"ID":"0d0615b5-ef3b-4932-957c-a4b44f35c1a9","Type":"ContainerStarted","Data":"f778fb88b8d7a66cb9e757f2a190b1f5ae397e2a3a2ef084d646a6696e5f99ae"} Feb 17 14:21:05 crc kubenswrapper[4836]: I0217 14:21:05.185122 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" podStartSLOduration=2.028844558 podStartE2EDuration="4.185100335s" podCreationTimestamp="2026-02-17 14:21:01 +0000 UTC" firstStartedPulling="2026-02-17 14:21:02.442482138 +0000 UTC m=+888.785410407" lastFinishedPulling="2026-02-17 14:21:04.598737915 +0000 UTC m=+890.941666184" observedRunningTime="2026-02-17 14:21:05.184350705 +0000 UTC m=+891.527278984" watchObservedRunningTime="2026-02-17 14:21:05.185100335 +0000 UTC m=+891.528028604" Feb 17 14:21:05 crc kubenswrapper[4836]: I0217 14:21:05.190187 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-w8wbg" podStartSLOduration=1.285013067 podStartE2EDuration="4.190162886s" podCreationTimestamp="2026-02-17 14:21:01 +0000 UTC" firstStartedPulling="2026-02-17 14:21:01.695062207 +0000 UTC m=+888.037990476" lastFinishedPulling="2026-02-17 14:21:04.600212026 +0000 UTC m=+890.943140295" observedRunningTime="2026-02-17 14:21:05.169695578 +0000 UTC m=+891.512623847" watchObservedRunningTime="2026-02-17 14:21:05.190162886 +0000 UTC m=+891.533091155" Feb 17 14:21:08 crc kubenswrapper[4836]: I0217 14:21:08.186447 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" event={"ID":"0d0615b5-ef3b-4932-957c-a4b44f35c1a9","Type":"ContainerStarted","Data":"9aadb1da88a882b1e411d7b4e93a538f345fea0b1d3d9c1af8adb50d6fff8506"} Feb 17 14:21:08 crc 
kubenswrapper[4836]: I0217 14:21:08.188865 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" event={"ID":"8fc6d41c-a8a1-4fe3-ade2-b79761920b17","Type":"ContainerStarted","Data":"50922edbb640e19c9d8a35cfe5d477f250d234d0781f4ab9c50277718f237ba4"} Feb 17 14:21:08 crc kubenswrapper[4836]: I0217 14:21:08.207857 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-877xf" podStartSLOduration=1.761940369 podStartE2EDuration="7.20783998s" podCreationTimestamp="2026-02-17 14:21:01 +0000 UTC" firstStartedPulling="2026-02-17 14:21:01.925137215 +0000 UTC m=+888.268065484" lastFinishedPulling="2026-02-17 14:21:07.371036826 +0000 UTC m=+893.713965095" observedRunningTime="2026-02-17 14:21:08.204524678 +0000 UTC m=+894.547452977" watchObservedRunningTime="2026-02-17 14:21:08.20783998 +0000 UTC m=+894.550768249" Feb 17 14:21:08 crc kubenswrapper[4836]: I0217 14:21:08.237797 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-q985f" podStartSLOduration=1.970026135 podStartE2EDuration="7.23777458s" podCreationTimestamp="2026-02-17 14:21:01 +0000 UTC" firstStartedPulling="2026-02-17 14:21:02.091970837 +0000 UTC m=+888.434899106" lastFinishedPulling="2026-02-17 14:21:07.359719282 +0000 UTC m=+893.702647551" observedRunningTime="2026-02-17 14:21:08.231598889 +0000 UTC m=+894.574527168" watchObservedRunningTime="2026-02-17 14:21:08.23777458 +0000 UTC m=+894.580702849" Feb 17 14:21:11 crc kubenswrapper[4836]: I0217 14:21:11.663108 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-w8wbg" Feb 17 14:21:12 crc kubenswrapper[4836]: I0217 14:21:12.110550 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:12 crc kubenswrapper[4836]: 
I0217 14:21:12.110872 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:12 crc kubenswrapper[4836]: I0217 14:21:12.118636 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:12 crc kubenswrapper[4836]: I0217 14:21:12.218115 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b844687d4-4gf5j" Feb 17 14:21:12 crc kubenswrapper[4836]: I0217 14:21:12.274591 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6zspj"] Feb 17 14:21:22 crc kubenswrapper[4836]: I0217 14:21:22.174641 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-52vj8" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.113589 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl"] Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.115618 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.124250 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.124588 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl"] Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.197513 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.197602 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.197662 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hck2p\" (UniqueName: \"kubernetes.io/projected/5939eb42-42be-4ecf-845a-c28b4669c02d-kube-api-access-hck2p\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: 
I0217 14:21:36.298988 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.299121 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hck2p\" (UniqueName: \"kubernetes.io/projected/5939eb42-42be-4ecf-845a-c28b4669c02d-kube-api-access-hck2p\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.299205 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.299823 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.300145 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.324893 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hck2p\" (UniqueName: \"kubernetes.io/projected/5939eb42-42be-4ecf-845a-c28b4669c02d-kube-api-access-hck2p\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.438927 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:36 crc kubenswrapper[4836]: I0217 14:21:36.863829 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl"] Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.317779 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-6zspj" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" containerID="cri-o://f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98" gracePeriod=15 Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.398764 4836 generic.go:334] "Generic (PLEG): container finished" podID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerID="36109a71edda8ae9aa419b8559cf5fe2431d0d712a414f525d482f59972b80ca" exitCode=0 Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.398831 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" event={"ID":"5939eb42-42be-4ecf-845a-c28b4669c02d","Type":"ContainerDied","Data":"36109a71edda8ae9aa419b8559cf5fe2431d0d712a414f525d482f59972b80ca"} Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.398860 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" event={"ID":"5939eb42-42be-4ecf-845a-c28b4669c02d","Type":"ContainerStarted","Data":"bfa7ffee62c7db8c9b964e7f91cb1a47a56e8fd7b3d25a5ed2ab5b4a481604e2"} Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.719465 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6zspj_6d52104b-91e7-4a3a-9138-163eb850485d/console/0.log" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.719792 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.824378 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-trusted-ca-bundle\") pod \"6d52104b-91e7-4a3a-9138-163eb850485d\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.824483 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-oauth-config\") pod \"6d52104b-91e7-4a3a-9138-163eb850485d\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.824521 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-console-config\") 
pod \"6d52104b-91e7-4a3a-9138-163eb850485d\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.824595 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-oauth-serving-cert\") pod \"6d52104b-91e7-4a3a-9138-163eb850485d\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.824618 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-service-ca\") pod \"6d52104b-91e7-4a3a-9138-163eb850485d\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.824651 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-serving-cert\") pod \"6d52104b-91e7-4a3a-9138-163eb850485d\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.824719 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grvpk\" (UniqueName: \"kubernetes.io/projected/6d52104b-91e7-4a3a-9138-163eb850485d-kube-api-access-grvpk\") pod \"6d52104b-91e7-4a3a-9138-163eb850485d\" (UID: \"6d52104b-91e7-4a3a-9138-163eb850485d\") " Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.825407 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6d52104b-91e7-4a3a-9138-163eb850485d" (UID: "6d52104b-91e7-4a3a-9138-163eb850485d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.825668 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6d52104b-91e7-4a3a-9138-163eb850485d" (UID: "6d52104b-91e7-4a3a-9138-163eb850485d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.825798 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-service-ca" (OuterVolumeSpecName: "service-ca") pod "6d52104b-91e7-4a3a-9138-163eb850485d" (UID: "6d52104b-91e7-4a3a-9138-163eb850485d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.826227 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-console-config" (OuterVolumeSpecName: "console-config") pod "6d52104b-91e7-4a3a-9138-163eb850485d" (UID: "6d52104b-91e7-4a3a-9138-163eb850485d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.826259 4836 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.826318 4836 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.826331 4836 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.832014 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6d52104b-91e7-4a3a-9138-163eb850485d" (UID: "6d52104b-91e7-4a3a-9138-163eb850485d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.832540 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d52104b-91e7-4a3a-9138-163eb850485d-kube-api-access-grvpk" (OuterVolumeSpecName: "kube-api-access-grvpk") pod "6d52104b-91e7-4a3a-9138-163eb850485d" (UID: "6d52104b-91e7-4a3a-9138-163eb850485d"). InnerVolumeSpecName "kube-api-access-grvpk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.835213 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6d52104b-91e7-4a3a-9138-163eb850485d" (UID: "6d52104b-91e7-4a3a-9138-163eb850485d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.927849 4836 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.927945 4836 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d52104b-91e7-4a3a-9138-163eb850485d-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.927960 4836 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d52104b-91e7-4a3a-9138-163eb850485d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:37 crc kubenswrapper[4836]: I0217 14:21:37.927972 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grvpk\" (UniqueName: \"kubernetes.io/projected/6d52104b-91e7-4a3a-9138-163eb850485d-kube-api-access-grvpk\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.452936 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-6zspj_6d52104b-91e7-4a3a-9138-163eb850485d/console/0.log" Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.452991 4836 generic.go:334] "Generic (PLEG): container finished" 
podID="6d52104b-91e7-4a3a-9138-163eb850485d" containerID="f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98" exitCode=2 Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.453025 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6zspj" event={"ID":"6d52104b-91e7-4a3a-9138-163eb850485d","Type":"ContainerDied","Data":"f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98"} Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.453054 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-6zspj" event={"ID":"6d52104b-91e7-4a3a-9138-163eb850485d","Type":"ContainerDied","Data":"291ff510753e6307affd77e72c2b113e622f07b799c9441e606ef5eb3889b1a8"} Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.453072 4836 scope.go:117] "RemoveContainer" containerID="f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98" Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.453222 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-6zspj" Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.503387 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-6zspj"] Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.508532 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-6zspj"] Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.585141 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" path="/var/lib/kubelet/pods/6d52104b-91e7-4a3a-9138-163eb850485d/volumes" Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.622683 4836 scope.go:117] "RemoveContainer" containerID="f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98" Feb 17 14:21:38 crc kubenswrapper[4836]: E0217 14:21:38.623437 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98\": container with ID starting with f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98 not found: ID does not exist" containerID="f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98" Feb 17 14:21:38 crc kubenswrapper[4836]: I0217 14:21:38.623474 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98"} err="failed to get container status \"f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98\": rpc error: code = NotFound desc = could not find container \"f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98\": container with ID starting with f775333908715051cce1f22baadc49c1832333ebb3ff40de54bd938fa8c37a98 not found: ID does not exist" Feb 17 14:21:38 crc kubenswrapper[4836]: E0217 14:21:38.938330 4836 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5939eb42_42be_4ecf_845a_c28b4669c02d.slice/crio-6ad7243e7b72de4f694d75068780dc31771b20f94dd9e2ee564008d8fbfeb3ca.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:21:39 crc kubenswrapper[4836]: I0217 14:21:39.464755 4836 generic.go:334] "Generic (PLEG): container finished" podID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerID="6ad7243e7b72de4f694d75068780dc31771b20f94dd9e2ee564008d8fbfeb3ca" exitCode=0 Feb 17 14:21:39 crc kubenswrapper[4836]: I0217 14:21:39.464811 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" event={"ID":"5939eb42-42be-4ecf-845a-c28b4669c02d","Type":"ContainerDied","Data":"6ad7243e7b72de4f694d75068780dc31771b20f94dd9e2ee564008d8fbfeb3ca"} Feb 17 14:21:40 crc kubenswrapper[4836]: I0217 14:21:40.474027 4836 generic.go:334] "Generic (PLEG): container finished" podID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerID="8027455750d65578381ecdd1bb12d3bb3c1d46d569bb8e1b1c71989150c32938" exitCode=0 Feb 17 14:21:40 crc kubenswrapper[4836]: I0217 14:21:40.474152 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" event={"ID":"5939eb42-42be-4ecf-845a-c28b4669c02d","Type":"ContainerDied","Data":"8027455750d65578381ecdd1bb12d3bb3c1d46d569bb8e1b1c71989150c32938"} Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.742334 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.783988 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-util\") pod \"5939eb42-42be-4ecf-845a-c28b4669c02d\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.784229 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-bundle\") pod \"5939eb42-42be-4ecf-845a-c28b4669c02d\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.784269 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hck2p\" (UniqueName: \"kubernetes.io/projected/5939eb42-42be-4ecf-845a-c28b4669c02d-kube-api-access-hck2p\") pod \"5939eb42-42be-4ecf-845a-c28b4669c02d\" (UID: \"5939eb42-42be-4ecf-845a-c28b4669c02d\") " Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.785343 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-bundle" (OuterVolumeSpecName: "bundle") pod "5939eb42-42be-4ecf-845a-c28b4669c02d" (UID: "5939eb42-42be-4ecf-845a-c28b4669c02d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.792013 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5939eb42-42be-4ecf-845a-c28b4669c02d-kube-api-access-hck2p" (OuterVolumeSpecName: "kube-api-access-hck2p") pod "5939eb42-42be-4ecf-845a-c28b4669c02d" (UID: "5939eb42-42be-4ecf-845a-c28b4669c02d"). InnerVolumeSpecName "kube-api-access-hck2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.800812 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-util" (OuterVolumeSpecName: "util") pod "5939eb42-42be-4ecf-845a-c28b4669c02d" (UID: "5939eb42-42be-4ecf-845a-c28b4669c02d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.885825 4836 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.885885 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hck2p\" (UniqueName: \"kubernetes.io/projected/5939eb42-42be-4ecf-845a-c28b4669c02d-kube-api-access-hck2p\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:41 crc kubenswrapper[4836]: I0217 14:21:41.885906 4836 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5939eb42-42be-4ecf-845a-c28b4669c02d-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:21:42 crc kubenswrapper[4836]: I0217 14:21:42.491962 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" event={"ID":"5939eb42-42be-4ecf-845a-c28b4669c02d","Type":"ContainerDied","Data":"bfa7ffee62c7db8c9b964e7f91cb1a47a56e8fd7b3d25a5ed2ab5b4a481604e2"} Feb 17 14:21:42 crc kubenswrapper[4836]: I0217 14:21:42.491998 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl" Feb 17 14:21:42 crc kubenswrapper[4836]: I0217 14:21:42.492001 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa7ffee62c7db8c9b964e7f91cb1a47a56e8fd7b3d25a5ed2ab5b4a481604e2" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.962220 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p9lh8"] Feb 17 14:21:43 crc kubenswrapper[4836]: E0217 14:21:43.962530 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerName="pull" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.962545 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerName="pull" Feb 17 14:21:43 crc kubenswrapper[4836]: E0217 14:21:43.962560 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerName="util" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.962565 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerName="util" Feb 17 14:21:43 crc kubenswrapper[4836]: E0217 14:21:43.962578 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.962584 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" Feb 17 14:21:43 crc kubenswrapper[4836]: E0217 14:21:43.962596 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerName="extract" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.962602 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5939eb42-42be-4ecf-845a-c28b4669c02d" 
containerName="extract" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.962697 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d52104b-91e7-4a3a-9138-163eb850485d" containerName="console" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.962716 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="5939eb42-42be-4ecf-845a-c28b4669c02d" containerName="extract" Feb 17 14:21:43 crc kubenswrapper[4836]: I0217 14:21:43.963542 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.034054 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p9lh8"] Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.146553 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-catalog-content\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.148511 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-utilities\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.148574 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76l57\" (UniqueName: \"kubernetes.io/projected/b11bad93-5af0-4c75-954c-42cc99684597-kube-api-access-76l57\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " 
pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.250460 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-catalog-content\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.250587 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-utilities\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.250648 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76l57\" (UniqueName: \"kubernetes.io/projected/b11bad93-5af0-4c75-954c-42cc99684597-kube-api-access-76l57\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.251842 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-catalog-content\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.252523 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-utilities\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " 
pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.272416 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76l57\" (UniqueName: \"kubernetes.io/projected/b11bad93-5af0-4c75-954c-42cc99684597-kube-api-access-76l57\") pod \"certified-operators-p9lh8\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") " pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.342707 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:44 crc kubenswrapper[4836]: I0217 14:21:44.860343 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p9lh8"] Feb 17 14:21:45 crc kubenswrapper[4836]: I0217 14:21:45.517226 4836 generic.go:334] "Generic (PLEG): container finished" podID="b11bad93-5af0-4c75-954c-42cc99684597" containerID="240ced32e0bb68659d8fa3215c9fcef735236350bf5d87b16d4adcec08100306" exitCode=0 Feb 17 14:21:45 crc kubenswrapper[4836]: I0217 14:21:45.517328 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9lh8" event={"ID":"b11bad93-5af0-4c75-954c-42cc99684597","Type":"ContainerDied","Data":"240ced32e0bb68659d8fa3215c9fcef735236350bf5d87b16d4adcec08100306"} Feb 17 14:21:45 crc kubenswrapper[4836]: I0217 14:21:45.517420 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9lh8" event={"ID":"b11bad93-5af0-4c75-954c-42cc99684597","Type":"ContainerStarted","Data":"f9b851496d5cce0303b165e18e03386edf9e343272b9928f99078559a7e8d5a0"} Feb 17 14:21:46 crc kubenswrapper[4836]: I0217 14:21:46.525400 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9lh8" 
event={"ID":"b11bad93-5af0-4c75-954c-42cc99684597","Type":"ContainerStarted","Data":"4fe1fe066d30b34ae39334b2a5ab55fea3ebf731c92645405e6f3bbb74be985c"} Feb 17 14:21:47 crc kubenswrapper[4836]: I0217 14:21:47.534807 4836 generic.go:334] "Generic (PLEG): container finished" podID="b11bad93-5af0-4c75-954c-42cc99684597" containerID="4fe1fe066d30b34ae39334b2a5ab55fea3ebf731c92645405e6f3bbb74be985c" exitCode=0 Feb 17 14:21:47 crc kubenswrapper[4836]: I0217 14:21:47.534895 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9lh8" event={"ID":"b11bad93-5af0-4c75-954c-42cc99684597","Type":"ContainerDied","Data":"4fe1fe066d30b34ae39334b2a5ab55fea3ebf731c92645405e6f3bbb74be985c"} Feb 17 14:21:48 crc kubenswrapper[4836]: I0217 14:21:48.543858 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9lh8" event={"ID":"b11bad93-5af0-4c75-954c-42cc99684597","Type":"ContainerStarted","Data":"23e3b31d1bf10a2dce07355e7766faafe2d1bf59f1230f2ba36f46b169423e26"} Feb 17 14:21:48 crc kubenswrapper[4836]: I0217 14:21:48.564847 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p9lh8" podStartSLOduration=3.138232373 podStartE2EDuration="5.564816245s" podCreationTimestamp="2026-02-17 14:21:43 +0000 UTC" firstStartedPulling="2026-02-17 14:21:45.519438341 +0000 UTC m=+931.862366610" lastFinishedPulling="2026-02-17 14:21:47.946022213 +0000 UTC m=+934.288950482" observedRunningTime="2026-02-17 14:21:48.561982748 +0000 UTC m=+934.904911037" watchObservedRunningTime="2026-02-17 14:21:48.564816245 +0000 UTC m=+934.907744534" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.425039 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt"] Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.426361 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.428417 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.428478 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.428685 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-vgpnt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.428885 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.429283 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.448428 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt"] Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.565798 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-apiservice-cert\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.565866 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw8dh\" (UniqueName: \"kubernetes.io/projected/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-kube-api-access-kw8dh\") pod 
\"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.565894 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-webhook-cert\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.666777 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-apiservice-cert\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.666859 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw8dh\" (UniqueName: \"kubernetes.io/projected/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-kube-api-access-kw8dh\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.666884 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-webhook-cert\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc 
kubenswrapper[4836]: I0217 14:21:53.675883 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-webhook-cert\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.697801 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-apiservice-cert\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.699213 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw8dh\" (UniqueName: \"kubernetes.io/projected/ccb35f40-d0b8-4a1e-8c45-63dd6987b72c-kube-api-access-kw8dh\") pod \"metallb-operator-controller-manager-69b9cbf5df-6fkqt\" (UID: \"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c\") " pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.742482 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.938769 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx"] Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.940899 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.944309 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7nbsm" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.946620 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.948323 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.961958 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx"] Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.975799 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gx9\" (UniqueName: \"kubernetes.io/projected/16c736d5-389e-4d03-9657-1abcd4448953-kube-api-access-w6gx9\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.975864 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16c736d5-389e-4d03-9657-1abcd4448953-apiservice-cert\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:53 crc kubenswrapper[4836]: I0217 14:21:53.975942 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/16c736d5-389e-4d03-9657-1abcd4448953-webhook-cert\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.079057 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16c736d5-389e-4d03-9657-1abcd4448953-webhook-cert\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.079149 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gx9\" (UniqueName: \"kubernetes.io/projected/16c736d5-389e-4d03-9657-1abcd4448953-kube-api-access-w6gx9\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.079182 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16c736d5-389e-4d03-9657-1abcd4448953-apiservice-cert\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.085009 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/16c736d5-389e-4d03-9657-1abcd4448953-webhook-cert\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc 
kubenswrapper[4836]: I0217 14:21:54.086043 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/16c736d5-389e-4d03-9657-1abcd4448953-apiservice-cert\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.123601 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6gx9\" (UniqueName: \"kubernetes.io/projected/16c736d5-389e-4d03-9657-1abcd4448953-kube-api-access-w6gx9\") pod \"metallb-operator-webhook-server-856546fc87-n5vrx\" (UID: \"16c736d5-389e-4d03-9657-1abcd4448953\") " pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.270738 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.282813 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt"] Feb 17 14:21:54 crc kubenswrapper[4836]: W0217 14:21:54.294958 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb35f40_d0b8_4a1e_8c45_63dd6987b72c.slice/crio-fec3dfd83907dceea71791e3f39ff9f7358e04f0d059a73af848f044d7987788 WatchSource:0}: Error finding container fec3dfd83907dceea71791e3f39ff9f7358e04f0d059a73af848f044d7987788: Status 404 returned error can't find the container with id fec3dfd83907dceea71791e3f39ff9f7358e04f0d059a73af848f044d7987788 Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.343238 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p9lh8" Feb 17 14:21:54 crc 
kubenswrapper[4836]: I0217 14:21:54.343811 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p9lh8"
Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.449054 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p9lh8"
Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.533975 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx"]
Feb 17 14:21:54 crc kubenswrapper[4836]: W0217 14:21:54.539839 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16c736d5_389e_4d03_9657_1abcd4448953.slice/crio-c936d1fb7a39e85562851c2f13505b9eddab11d6528d5d2a322e4ba467ba7694 WatchSource:0}: Error finding container c936d1fb7a39e85562851c2f13505b9eddab11d6528d5d2a322e4ba467ba7694: Status 404 returned error can't find the container with id c936d1fb7a39e85562851c2f13505b9eddab11d6528d5d2a322e4ba467ba7694
Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.587594 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" event={"ID":"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c","Type":"ContainerStarted","Data":"fec3dfd83907dceea71791e3f39ff9f7358e04f0d059a73af848f044d7987788"}
Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.588730 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" event={"ID":"16c736d5-389e-4d03-9657-1abcd4448953","Type":"ContainerStarted","Data":"c936d1fb7a39e85562851c2f13505b9eddab11d6528d5d2a322e4ba467ba7694"}
Feb 17 14:21:54 crc kubenswrapper[4836]: I0217 14:21:54.630422 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p9lh8"
Feb 17 14:21:56 crc kubenswrapper[4836]: I0217 14:21:56.755982 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p9lh8"]
Feb 17 14:21:57 crc kubenswrapper[4836]: I0217 14:21:57.611927 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p9lh8" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="registry-server" containerID="cri-o://23e3b31d1bf10a2dce07355e7766faafe2d1bf59f1230f2ba36f46b169423e26" gracePeriod=2
Feb 17 14:21:58 crc kubenswrapper[4836]: I0217 14:21:58.629338 4836 generic.go:334] "Generic (PLEG): container finished" podID="b11bad93-5af0-4c75-954c-42cc99684597" containerID="23e3b31d1bf10a2dce07355e7766faafe2d1bf59f1230f2ba36f46b169423e26" exitCode=0
Feb 17 14:21:58 crc kubenswrapper[4836]: I0217 14:21:58.629398 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9lh8" event={"ID":"b11bad93-5af0-4c75-954c-42cc99684597","Type":"ContainerDied","Data":"23e3b31d1bf10a2dce07355e7766faafe2d1bf59f1230f2ba36f46b169423e26"}
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.127613 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p9lh8"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.308891 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-catalog-content\") pod \"b11bad93-5af0-4c75-954c-42cc99684597\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") "
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.308975 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76l57\" (UniqueName: \"kubernetes.io/projected/b11bad93-5af0-4c75-954c-42cc99684597-kube-api-access-76l57\") pod \"b11bad93-5af0-4c75-954c-42cc99684597\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") "
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.311035 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-utilities\") pod \"b11bad93-5af0-4c75-954c-42cc99684597\" (UID: \"b11bad93-5af0-4c75-954c-42cc99684597\") "
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.312912 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-utilities" (OuterVolumeSpecName: "utilities") pod "b11bad93-5af0-4c75-954c-42cc99684597" (UID: "b11bad93-5af0-4c75-954c-42cc99684597"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.318258 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11bad93-5af0-4c75-954c-42cc99684597-kube-api-access-76l57" (OuterVolumeSpecName: "kube-api-access-76l57") pod "b11bad93-5af0-4c75-954c-42cc99684597" (UID: "b11bad93-5af0-4c75-954c-42cc99684597"). InnerVolumeSpecName "kube-api-access-76l57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.369243 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11bad93-5af0-4c75-954c-42cc99684597" (UID: "b11bad93-5af0-4c75-954c-42cc99684597"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.412959 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.413228 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11bad93-5af0-4c75-954c-42cc99684597-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.413731 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76l57\" (UniqueName: \"kubernetes.io/projected/b11bad93-5af0-4c75-954c-42cc99684597-kube-api-access-76l57\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.642565 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" event={"ID":"ccb35f40-d0b8-4a1e-8c45-63dd6987b72c","Type":"ContainerStarted","Data":"a07deea8e75a533163f5f5f6a1fc785de1ad00b58a99eccf4b41d397fabad11c"}
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.642691 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.645533 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9lh8" event={"ID":"b11bad93-5af0-4c75-954c-42cc99684597","Type":"ContainerDied","Data":"f9b851496d5cce0303b165e18e03386edf9e343272b9928f99078559a7e8d5a0"}
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.645930 4836 scope.go:117] "RemoveContainer" containerID="23e3b31d1bf10a2dce07355e7766faafe2d1bf59f1230f2ba36f46b169423e26"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.645588 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p9lh8"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.647875 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" event={"ID":"16c736d5-389e-4d03-9657-1abcd4448953","Type":"ContainerStarted","Data":"47bb71e6c27b2365020b83b9886828ebf9da6c15c21da87051b1933fdd3210e0"}
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.648401 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.678979 4836 scope.go:117] "RemoveContainer" containerID="4fe1fe066d30b34ae39334b2a5ab55fea3ebf731c92645405e6f3bbb74be985c"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.682554 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt" podStartSLOduration=1.891030851 podStartE2EDuration="7.68254315s" podCreationTimestamp="2026-02-17 14:21:53 +0000 UTC" firstStartedPulling="2026-02-17 14:21:54.300672071 +0000 UTC m=+940.643600340" lastFinishedPulling="2026-02-17 14:22:00.09218437 +0000 UTC m=+946.435112639" observedRunningTime="2026-02-17 14:22:00.680907716 +0000 UTC m=+947.023835985" watchObservedRunningTime="2026-02-17 14:22:00.68254315 +0000 UTC m=+947.025471419"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.710325 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p9lh8"]
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.713648 4836 scope.go:117] "RemoveContainer" containerID="240ced32e0bb68659d8fa3215c9fcef735236350bf5d87b16d4adcec08100306"
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.715864 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p9lh8"]
Feb 17 14:22:00 crc kubenswrapper[4836]: I0217 14:22:00.740635 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx" podStartSLOduration=2.118125303 podStartE2EDuration="7.740610969s" podCreationTimestamp="2026-02-17 14:21:53 +0000 UTC" firstStartedPulling="2026-02-17 14:21:54.542630586 +0000 UTC m=+940.885558855" lastFinishedPulling="2026-02-17 14:22:00.165116262 +0000 UTC m=+946.508044521" observedRunningTime="2026-02-17 14:22:00.734828801 +0000 UTC m=+947.077757080" watchObservedRunningTime="2026-02-17 14:22:00.740610969 +0000 UTC m=+947.083539238"
Feb 17 14:22:02 crc kubenswrapper[4836]: I0217 14:22:02.575216 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11bad93-5af0-4c75-954c-42cc99684597" path="/var/lib/kubelet/pods/b11bad93-5af0-4c75-954c-42cc99684597/volumes"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.785011 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vmx8"]
Feb 17 14:22:09 crc kubenswrapper[4836]: E0217 14:22:09.786066 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="extract-utilities"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.786083 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="extract-utilities"
Feb 17 14:22:09 crc kubenswrapper[4836]: E0217 14:22:09.786095 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="extract-content"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.786102 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="extract-content"
Feb 17 14:22:09 crc kubenswrapper[4836]: E0217 14:22:09.786120 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="registry-server"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.786129 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="registry-server"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.786571 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="b11bad93-5af0-4c75-954c-42cc99684597" containerName="registry-server"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.787546 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.802370 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vmx8"]
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.850288 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-utilities\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.850668 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-catalog-content\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.850777 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtxkb\" (UniqueName: \"kubernetes.io/projected/989e8ec6-9217-43f4-969a-07d9cb793ca9-kube-api-access-gtxkb\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.951586 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-utilities\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.951977 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-catalog-content\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.952103 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtxkb\" (UniqueName: \"kubernetes.io/projected/989e8ec6-9217-43f4-969a-07d9cb793ca9-kube-api-access-gtxkb\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.952203 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-utilities\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.952603 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-catalog-content\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:09 crc kubenswrapper[4836]: I0217 14:22:09.974534 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtxkb\" (UniqueName: \"kubernetes.io/projected/989e8ec6-9217-43f4-969a-07d9cb793ca9-kube-api-access-gtxkb\") pod \"redhat-marketplace-9vmx8\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") " pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:10 crc kubenswrapper[4836]: I0217 14:22:10.108661 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:10 crc kubenswrapper[4836]: I0217 14:22:10.587543 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vmx8"]
Feb 17 14:22:10 crc kubenswrapper[4836]: I0217 14:22:10.722775 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vmx8" event={"ID":"989e8ec6-9217-43f4-969a-07d9cb793ca9","Type":"ContainerStarted","Data":"8fe721131e87d39ebd81dc2d83e5b54d94718d9776edb118ec98f74e1234a082"}
Feb 17 14:22:11 crc kubenswrapper[4836]: I0217 14:22:11.731492 4836 generic.go:334] "Generic (PLEG): container finished" podID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerID="82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282" exitCode=0
Feb 17 14:22:11 crc kubenswrapper[4836]: I0217 14:22:11.731689 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vmx8" event={"ID":"989e8ec6-9217-43f4-969a-07d9cb793ca9","Type":"ContainerDied","Data":"82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282"}
Feb 17 14:22:12 crc kubenswrapper[4836]: I0217 14:22:12.747317 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vmx8" event={"ID":"989e8ec6-9217-43f4-969a-07d9cb793ca9","Type":"ContainerStarted","Data":"95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a"}
Feb 17 14:22:13 crc kubenswrapper[4836]: I0217 14:22:13.758444 4836 generic.go:334] "Generic (PLEG): container finished" podID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerID="95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a" exitCode=0
Feb 17 14:22:13 crc kubenswrapper[4836]: I0217 14:22:13.758582 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vmx8" event={"ID":"989e8ec6-9217-43f4-969a-07d9cb793ca9","Type":"ContainerDied","Data":"95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a"}
Feb 17 14:22:14 crc kubenswrapper[4836]: I0217 14:22:14.283600 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-856546fc87-n5vrx"
Feb 17 14:22:14 crc kubenswrapper[4836]: I0217 14:22:14.780365 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vmx8" event={"ID":"989e8ec6-9217-43f4-969a-07d9cb793ca9","Type":"ContainerStarted","Data":"59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59"}
Feb 17 14:22:14 crc kubenswrapper[4836]: I0217 14:22:14.806113 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vmx8" podStartSLOduration=3.356018594 podStartE2EDuration="5.806097644s" podCreationTimestamp="2026-02-17 14:22:09 +0000 UTC" firstStartedPulling="2026-02-17 14:22:11.734026585 +0000 UTC m=+958.076954854" lastFinishedPulling="2026-02-17 14:22:14.184105635 +0000 UTC m=+960.527033904" observedRunningTime="2026-02-17 14:22:14.804803549 +0000 UTC m=+961.147731828" watchObservedRunningTime="2026-02-17 14:22:14.806097644 +0000 UTC m=+961.149025913"
Feb 17 14:22:20 crc kubenswrapper[4836]: I0217 14:22:20.109650 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:20 crc kubenswrapper[4836]: I0217 14:22:20.110386 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:20 crc kubenswrapper[4836]: I0217 14:22:20.156845 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:21 crc kubenswrapper[4836]: I0217 14:22:21.109611 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:21 crc kubenswrapper[4836]: I0217 14:22:21.156649 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vmx8"]
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.073261 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9vmx8" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="registry-server" containerID="cri-o://59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59" gracePeriod=2
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.452872 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.543948 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-catalog-content\") pod \"989e8ec6-9217-43f4-969a-07d9cb793ca9\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") "
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.544030 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtxkb\" (UniqueName: \"kubernetes.io/projected/989e8ec6-9217-43f4-969a-07d9cb793ca9-kube-api-access-gtxkb\") pod \"989e8ec6-9217-43f4-969a-07d9cb793ca9\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") "
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.544081 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-utilities\") pod \"989e8ec6-9217-43f4-969a-07d9cb793ca9\" (UID: \"989e8ec6-9217-43f4-969a-07d9cb793ca9\") "
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.545521 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-utilities" (OuterVolumeSpecName: "utilities") pod "989e8ec6-9217-43f4-969a-07d9cb793ca9" (UID: "989e8ec6-9217-43f4-969a-07d9cb793ca9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.551529 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989e8ec6-9217-43f4-969a-07d9cb793ca9-kube-api-access-gtxkb" (OuterVolumeSpecName: "kube-api-access-gtxkb") pod "989e8ec6-9217-43f4-969a-07d9cb793ca9" (UID: "989e8ec6-9217-43f4-969a-07d9cb793ca9"). InnerVolumeSpecName "kube-api-access-gtxkb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.645496 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.645532 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtxkb\" (UniqueName: \"kubernetes.io/projected/989e8ec6-9217-43f4-969a-07d9cb793ca9-kube-api-access-gtxkb\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.730751 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "989e8ec6-9217-43f4-969a-07d9cb793ca9" (UID: "989e8ec6-9217-43f4-969a-07d9cb793ca9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:22:23 crc kubenswrapper[4836]: I0217 14:22:23.747314 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/989e8ec6-9217-43f4-969a-07d9cb793ca9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.081200 4836 generic.go:334] "Generic (PLEG): container finished" podID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerID="59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59" exitCode=0
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.081283 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vmx8"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.081318 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vmx8" event={"ID":"989e8ec6-9217-43f4-969a-07d9cb793ca9","Type":"ContainerDied","Data":"59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59"}
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.082338 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vmx8" event={"ID":"989e8ec6-9217-43f4-969a-07d9cb793ca9","Type":"ContainerDied","Data":"8fe721131e87d39ebd81dc2d83e5b54d94718d9776edb118ec98f74e1234a082"}
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.082366 4836 scope.go:117] "RemoveContainer" containerID="59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.107163 4836 scope.go:117] "RemoveContainer" containerID="95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.120088 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vmx8"]
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.139385 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vmx8"]
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.152727 4836 scope.go:117] "RemoveContainer" containerID="82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.171811 4836 scope.go:117] "RemoveContainer" containerID="59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59"
Feb 17 14:22:24 crc kubenswrapper[4836]: E0217 14:22:24.172661 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59\": container with ID starting with 59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59 not found: ID does not exist" containerID="59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.172706 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59"} err="failed to get container status \"59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59\": rpc error: code = NotFound desc = could not find container \"59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59\": container with ID starting with 59ac6df301e0f88ce0c625edb0b19c60eb087a56643b9f74e869499526bb8a59 not found: ID does not exist"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.172732 4836 scope.go:117] "RemoveContainer" containerID="95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a"
Feb 17 14:22:24 crc kubenswrapper[4836]: E0217 14:22:24.172979 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a\": container with ID starting with 95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a not found: ID does not exist" containerID="95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.173010 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a"} err="failed to get container status \"95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a\": rpc error: code = NotFound desc = could not find container \"95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a\": container with ID starting with 95f148ce857c93eefbaf1d9b8240286284786811ddbc3cdd133a6fa5f0568d0a not found: ID does not exist"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.173028 4836 scope.go:117] "RemoveContainer" containerID="82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282"
Feb 17 14:22:24 crc kubenswrapper[4836]: E0217 14:22:24.173530 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282\": container with ID starting with 82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282 not found: ID does not exist" containerID="82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.173554 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282"} err="failed to get container status \"82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282\": rpc error: code = NotFound desc = could not find container \"82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282\": container with ID starting with 82e3c1617eac7ebac1ee94ffed06e82d3836f7d995455d1ac8436067da32e282 not found: ID does not exist"
Feb 17 14:22:24 crc kubenswrapper[4836]: I0217 14:22:24.575215 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" path="/var/lib/kubelet/pods/989e8ec6-9217-43f4-969a-07d9cb793ca9/volumes"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.166766 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vxxx8"]
Feb 17 14:22:26 crc kubenswrapper[4836]: E0217 14:22:26.167056 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="registry-server"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.167073 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="registry-server"
Feb 17 14:22:26 crc kubenswrapper[4836]: E0217 14:22:26.167086 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="extract-content"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.167094 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="extract-content"
Feb 17 14:22:26 crc kubenswrapper[4836]: E0217 14:22:26.167114 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="extract-utilities"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.167122 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="extract-utilities"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.167262 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="989e8ec6-9217-43f4-969a-07d9cb793ca9" containerName="registry-server"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.168279 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.176311 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxxx8"]
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.282903 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4jss\" (UniqueName: \"kubernetes.io/projected/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-kube-api-access-q4jss\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.283024 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-utilities\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.283068 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-catalog-content\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.426804 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4jss\" (UniqueName: \"kubernetes.io/projected/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-kube-api-access-q4jss\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.426871 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-utilities\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.426899 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-catalog-content\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.427628 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-catalog-content\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.427979 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-utilities\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.453153 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4jss\" (UniqueName: \"kubernetes.io/projected/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-kube-api-access-q4jss\") pod \"community-operators-vxxx8\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.534490 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxxx8"
Feb 17 14:22:26 crc kubenswrapper[4836]: I0217 14:22:26.773706 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxxx8"]
Feb 17 14:22:27 crc kubenswrapper[4836]: I0217 14:22:27.106478 4836 generic.go:334] "Generic (PLEG): container finished" podID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerID="87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03" exitCode=0
Feb 17 14:22:27 crc kubenswrapper[4836]: I0217 14:22:27.106599 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxxx8" event={"ID":"ee0bd3ed-4af9-40a3-9742-ee548934f0c7","Type":"ContainerDied","Data":"87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03"}
Feb 17 14:22:27 crc kubenswrapper[4836]: I0217 14:22:27.106935 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxxx8" event={"ID":"ee0bd3ed-4af9-40a3-9742-ee548934f0c7","Type":"ContainerStarted","Data":"cb2c2b42c4e66e02d5c1a234e004888a9a328e8e2b0673f2fef499c320e33d68"}
Feb 17 14:22:28 crc kubenswrapper[4836]: I0217 14:22:28.116012 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxxx8" event={"ID":"ee0bd3ed-4af9-40a3-9742-ee548934f0c7","Type":"ContainerStarted","Data":"a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d"}
Feb 17 14:22:29 crc kubenswrapper[4836]: I0217 14:22:29.125267 4836 generic.go:334] "Generic (PLEG): container finished" podID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerID="a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d" exitCode=0
Feb 17 14:22:29 crc kubenswrapper[4836]: I0217 14:22:29.125663 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxxx8" event={"ID":"ee0bd3ed-4af9-40a3-9742-ee548934f0c7","Type":"ContainerDied","Data":"a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d"}
Feb 17 14:22:30 crc kubenswrapper[4836]: I0217 14:22:30.137004 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxxx8" event={"ID":"ee0bd3ed-4af9-40a3-9742-ee548934f0c7","Type":"ContainerStarted","Data":"130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c"}
Feb 17 14:22:30 crc kubenswrapper[4836]: I0217 14:22:30.158271 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vxxx8" podStartSLOduration=1.781059736 podStartE2EDuration="4.158249716s" podCreationTimestamp="2026-02-17 14:22:26 +0000 UTC" firstStartedPulling="2026-02-17 14:22:27.108243121 +0000 UTC m=+973.451171390" lastFinishedPulling="2026-02-17 14:22:29.485433091 +0000 UTC m=+975.828361370" observedRunningTime="2026-02-17 14:22:30.152948761 +0000 UTC m=+976.495877040" watchObservedRunningTime="2026-02-17 14:22:30.158249716 +0000 UTC m=+976.501177985"
Feb 17 14:22:33 crc kubenswrapper[4836]: I0217 14:22:33.745929 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-69b9cbf5df-6fkqt"
Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.591636 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-x257b"]
Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.596176 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt"]
Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.596431 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x257b"
Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.596937 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.600264 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.600456 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.601517 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.601674 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-cwnkd" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.605220 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt"] Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.693743 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pb5ff"] Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.694821 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.699468 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-md5x8" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.699463 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.699550 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.699517 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.706928 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-szl4j"] Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.708202 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.710383 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.721773 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-szl4j"] Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726471 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-reloader\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726559 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhqr5\" (UniqueName: \"kubernetes.io/projected/e019f338-ff73-4160-a283-a71e9e6119b3-kube-api-access-jhqr5\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726656 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-frr-sockets\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726689 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e019f338-ff73-4160-a283-a71e9e6119b3-frr-startup\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726716 
4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e019f338-ff73-4160-a283-a71e9e6119b3-metrics-certs\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726730 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-metrics\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726751 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-frr-conf\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726777 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp7z7\" (UniqueName: \"kubernetes.io/projected/18ec2995-af0c-4c47-aa70-480f9323329e-kube-api-access-bp7z7\") pod \"frr-k8s-webhook-server-78b44bf5bb-mznjt\" (UID: \"18ec2995-af0c-4c47-aa70-480f9323329e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.726807 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18ec2995-af0c-4c47-aa70-480f9323329e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-mznjt\" (UID: \"18ec2995-af0c-4c47-aa70-480f9323329e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828273 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-frr-sockets\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828362 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e019f338-ff73-4160-a283-a71e9e6119b3-frr-startup\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828387 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e019f338-ff73-4160-a283-a71e9e6119b3-metrics-certs\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828404 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-metrics\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828423 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-frr-conf\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828448 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htwzc\" (UniqueName: \"kubernetes.io/projected/2690ef6e-0489-43f3-b787-8b6c1295e283-kube-api-access-htwzc\") 
pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828473 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27eed55a-1a00-497e-9aa4-74f7007f336e-metrics-certs\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828515 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp7z7\" (UniqueName: \"kubernetes.io/projected/18ec2995-af0c-4c47-aa70-480f9323329e-kube-api-access-bp7z7\") pod \"frr-k8s-webhook-server-78b44bf5bb-mznjt\" (UID: \"18ec2995-af0c-4c47-aa70-480f9323329e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828552 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18ec2995-af0c-4c47-aa70-480f9323329e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-mznjt\" (UID: \"18ec2995-af0c-4c47-aa70-480f9323329e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828581 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27eed55a-1a00-497e-9aa4-74f7007f336e-cert\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828678 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-reloader\") pod \"frr-k8s-x257b\" (UID: 
\"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828708 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-metrics-certs\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828850 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhqr5\" (UniqueName: \"kubernetes.io/projected/e019f338-ff73-4160-a283-a71e9e6119b3-kube-api-access-jhqr5\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828884 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2690ef6e-0489-43f3-b787-8b6c1295e283-metallb-excludel2\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828918 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szs88\" (UniqueName: \"kubernetes.io/projected/27eed55a-1a00-497e-9aa4-74f7007f336e-kube-api-access-szs88\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.828955 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " 
pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.829164 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-metrics\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.829181 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-reloader\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.829462 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-frr-sockets\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.829537 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e019f338-ff73-4160-a283-a71e9e6119b3-frr-conf\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.829575 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e019f338-ff73-4160-a283-a71e9e6119b3-frr-startup\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.836285 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e019f338-ff73-4160-a283-a71e9e6119b3-metrics-certs\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.837139 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18ec2995-af0c-4c47-aa70-480f9323329e-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-mznjt\" (UID: \"18ec2995-af0c-4c47-aa70-480f9323329e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.855388 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp7z7\" (UniqueName: \"kubernetes.io/projected/18ec2995-af0c-4c47-aa70-480f9323329e-kube-api-access-bp7z7\") pod \"frr-k8s-webhook-server-78b44bf5bb-mznjt\" (UID: \"18ec2995-af0c-4c47-aa70-480f9323329e\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.900770 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhqr5\" (UniqueName: \"kubernetes.io/projected/e019f338-ff73-4160-a283-a71e9e6119b3-kube-api-access-jhqr5\") pod \"frr-k8s-x257b\" (UID: \"e019f338-ff73-4160-a283-a71e9e6119b3\") " pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.925696 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.930984 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htwzc\" (UniqueName: \"kubernetes.io/projected/2690ef6e-0489-43f3-b787-8b6c1295e283-kube-api-access-htwzc\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.931044 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27eed55a-1a00-497e-9aa4-74f7007f336e-metrics-certs\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.931828 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27eed55a-1a00-497e-9aa4-74f7007f336e-cert\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.931870 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-metrics-certs\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.931907 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2690ef6e-0489-43f3-b787-8b6c1295e283-metallb-excludel2\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.931932 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szs88\" (UniqueName: \"kubernetes.io/projected/27eed55a-1a00-497e-9aa4-74f7007f336e-kube-api-access-szs88\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.931974 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: E0217 14:22:34.932076 4836 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 14:22:34 crc kubenswrapper[4836]: E0217 14:22:34.932134 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist podName:2690ef6e-0489-43f3-b787-8b6c1295e283 nodeName:}" failed. No retries permitted until 2026-02-17 14:22:35.432108801 +0000 UTC m=+981.775037070 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist") pod "speaker-pb5ff" (UID: "2690ef6e-0489-43f3-b787-8b6c1295e283") : secret "metallb-memberlist" not found Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.933051 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2690ef6e-0489-43f3-b787-8b6c1295e283-metallb-excludel2\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.935460 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-metrics-certs\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.935800 4836 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.936723 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27eed55a-1a00-497e-9aa4-74f7007f336e-metrics-certs\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.940745 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.945750 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27eed55a-1a00-497e-9aa4-74f7007f336e-cert\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.950351 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htwzc\" (UniqueName: \"kubernetes.io/projected/2690ef6e-0489-43f3-b787-8b6c1295e283-kube-api-access-htwzc\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:34 crc kubenswrapper[4836]: I0217 14:22:34.954871 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szs88\" (UniqueName: \"kubernetes.io/projected/27eed55a-1a00-497e-9aa4-74f7007f336e-kube-api-access-szs88\") pod \"controller-69bbfbf88f-szl4j\" (UID: \"27eed55a-1a00-497e-9aa4-74f7007f336e\") " pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:35 crc kubenswrapper[4836]: I0217 14:22:35.024629 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:35 crc kubenswrapper[4836]: I0217 14:22:35.132059 4836 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:22:35 crc kubenswrapper[4836]: I0217 14:22:35.175098 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerStarted","Data":"74568734c97bb5c9fae50817ad229949aa021b4982838f7cc380326f6b22251f"} Feb 17 14:22:35 crc kubenswrapper[4836]: I0217 14:22:35.244918 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt"] Feb 17 14:22:35 crc kubenswrapper[4836]: W0217 14:22:35.250957 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18ec2995_af0c_4c47_aa70_480f9323329e.slice/crio-10857b4d4594f8173323137f0eb07e6d97fa097637eeb77eb062b9d77fc891d9 WatchSource:0}: Error finding container 10857b4d4594f8173323137f0eb07e6d97fa097637eeb77eb062b9d77fc891d9: Status 404 returned error can't find the container with id 10857b4d4594f8173323137f0eb07e6d97fa097637eeb77eb062b9d77fc891d9 Feb 17 14:22:35 crc kubenswrapper[4836]: I0217 14:22:35.340315 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-szl4j"] Feb 17 14:22:35 crc kubenswrapper[4836]: W0217 14:22:35.342313 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27eed55a_1a00_497e_9aa4_74f7007f336e.slice/crio-349fcc2a13683cafd7df0f94207afba27f23b46ab5a2a33a05a8df4a0a33eb0b WatchSource:0}: Error finding container 349fcc2a13683cafd7df0f94207afba27f23b46ab5a2a33a05a8df4a0a33eb0b: Status 404 returned error can't find the container with id 349fcc2a13683cafd7df0f94207afba27f23b46ab5a2a33a05a8df4a0a33eb0b Feb 17 14:22:35 crc 
kubenswrapper[4836]: I0217 14:22:35.440399 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:35 crc kubenswrapper[4836]: E0217 14:22:35.440638 4836 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 14:22:35 crc kubenswrapper[4836]: E0217 14:22:35.440752 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist podName:2690ef6e-0489-43f3-b787-8b6c1295e283 nodeName:}" failed. No retries permitted until 2026-02-17 14:22:36.440732783 +0000 UTC m=+982.783661052 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist") pod "speaker-pb5ff" (UID: "2690ef6e-0489-43f3-b787-8b6c1295e283") : secret "metallb-memberlist" not found Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.184722 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-szl4j" event={"ID":"27eed55a-1a00-497e-9aa4-74f7007f336e","Type":"ContainerStarted","Data":"e084d3dab6f69a37fb444957d942314d3bf90015779c0b410e5d662a99910549"} Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.185180 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-szl4j" event={"ID":"27eed55a-1a00-497e-9aa4-74f7007f336e","Type":"ContainerStarted","Data":"1bab3001db627bc25ecc180ac435d391eead6634a59401452067eaef7eb48f43"} Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.185223 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-szl4j" 
event={"ID":"27eed55a-1a00-497e-9aa4-74f7007f336e","Type":"ContainerStarted","Data":"349fcc2a13683cafd7df0f94207afba27f23b46ab5a2a33a05a8df4a0a33eb0b"} Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.185246 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.186453 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" event={"ID":"18ec2995-af0c-4c47-aa70-480f9323329e","Type":"ContainerStarted","Data":"10857b4d4594f8173323137f0eb07e6d97fa097637eeb77eb062b9d77fc891d9"} Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.206715 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-szl4j" podStartSLOduration=2.206691027 podStartE2EDuration="2.206691027s" podCreationTimestamp="2026-02-17 14:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:22:36.202792668 +0000 UTC m=+982.545720957" watchObservedRunningTime="2026-02-17 14:22:36.206691027 +0000 UTC m=+982.549619306" Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.498423 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.504414 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2690ef6e-0489-43f3-b787-8b6c1295e283-memberlist\") pod \"speaker-pb5ff\" (UID: \"2690ef6e-0489-43f3-b787-8b6c1295e283\") " pod="metallb-system/speaker-pb5ff" Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.511371 
4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pb5ff" Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.536149 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vxxx8" Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.536195 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vxxx8" Feb 17 14:22:36 crc kubenswrapper[4836]: I0217 14:22:36.586129 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vxxx8" Feb 17 14:22:37 crc kubenswrapper[4836]: I0217 14:22:37.245554 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pb5ff" event={"ID":"2690ef6e-0489-43f3-b787-8b6c1295e283","Type":"ContainerStarted","Data":"37b6e206e2beb70b9b6f25c3505a37f97c167cb7b97fc9f25540cac2014508ce"} Feb 17 14:22:37 crc kubenswrapper[4836]: I0217 14:22:37.245897 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pb5ff" event={"ID":"2690ef6e-0489-43f3-b787-8b6c1295e283","Type":"ContainerStarted","Data":"af41ecfd4980ad5a9b76d3887f65a9f91bf29d79cbea9d371fcff42dffe9b36e"} Feb 17 14:22:37 crc kubenswrapper[4836]: I0217 14:22:37.300074 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vxxx8" Feb 17 14:22:37 crc kubenswrapper[4836]: I0217 14:22:37.441646 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxxx8"] Feb 17 14:22:38 crc kubenswrapper[4836]: I0217 14:22:38.268639 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pb5ff" event={"ID":"2690ef6e-0489-43f3-b787-8b6c1295e283","Type":"ContainerStarted","Data":"8e8f7d7b2105a43f43460c23011fa0cfe81eff9f63ca393e833b54a7665842dd"} Feb 17 14:22:38 crc 
kubenswrapper[4836]: I0217 14:22:38.296700 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pb5ff" podStartSLOduration=4.296681382 podStartE2EDuration="4.296681382s" podCreationTimestamp="2026-02-17 14:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:22:38.293363037 +0000 UTC m=+984.636291306" watchObservedRunningTime="2026-02-17 14:22:38.296681382 +0000 UTC m=+984.639609661" Feb 17 14:22:39 crc kubenswrapper[4836]: I0217 14:22:39.276309 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vxxx8" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="registry-server" containerID="cri-o://130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c" gracePeriod=2 Feb 17 14:22:39 crc kubenswrapper[4836]: I0217 14:22:39.276594 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pb5ff" Feb 17 14:22:39 crc kubenswrapper[4836]: I0217 14:22:39.904106 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vxxx8" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.062902 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-utilities\") pod \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.063016 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-catalog-content\") pod \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.063254 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4jss\" (UniqueName: \"kubernetes.io/projected/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-kube-api-access-q4jss\") pod \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\" (UID: \"ee0bd3ed-4af9-40a3-9742-ee548934f0c7\") " Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.064628 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-utilities" (OuterVolumeSpecName: "utilities") pod "ee0bd3ed-4af9-40a3-9742-ee548934f0c7" (UID: "ee0bd3ed-4af9-40a3-9742-ee548934f0c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.082617 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-kube-api-access-q4jss" (OuterVolumeSpecName: "kube-api-access-q4jss") pod "ee0bd3ed-4af9-40a3-9742-ee548934f0c7" (UID: "ee0bd3ed-4af9-40a3-9742-ee548934f0c7"). InnerVolumeSpecName "kube-api-access-q4jss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.118728 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee0bd3ed-4af9-40a3-9742-ee548934f0c7" (UID: "ee0bd3ed-4af9-40a3-9742-ee548934f0c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.165811 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4jss\" (UniqueName: \"kubernetes.io/projected/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-kube-api-access-q4jss\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.165873 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.165887 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bd3ed-4af9-40a3-9742-ee548934f0c7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.286705 4836 generic.go:334] "Generic (PLEG): container finished" podID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerID="130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c" exitCode=0 Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.286802 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vxxx8" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.286802 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxxx8" event={"ID":"ee0bd3ed-4af9-40a3-9742-ee548934f0c7","Type":"ContainerDied","Data":"130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c"} Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.287006 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxxx8" event={"ID":"ee0bd3ed-4af9-40a3-9742-ee548934f0c7","Type":"ContainerDied","Data":"cb2c2b42c4e66e02d5c1a234e004888a9a328e8e2b0673f2fef499c320e33d68"} Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.287036 4836 scope.go:117] "RemoveContainer" containerID="130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.327786 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxxx8"] Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.334876 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vxxx8"] Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.503848 4836 scope.go:117] "RemoveContainer" containerID="a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.556151 4836 scope.go:117] "RemoveContainer" containerID="87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.584373 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" path="/var/lib/kubelet/pods/ee0bd3ed-4af9-40a3-9742-ee548934f0c7/volumes" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.601442 4836 scope.go:117] "RemoveContainer" 
containerID="130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c" Feb 17 14:22:40 crc kubenswrapper[4836]: E0217 14:22:40.603364 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c\": container with ID starting with 130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c not found: ID does not exist" containerID="130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.603429 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c"} err="failed to get container status \"130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c\": rpc error: code = NotFound desc = could not find container \"130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c\": container with ID starting with 130efc368435db6a4abd3d1e4331eadbf9c826809151887de231fde45fba1e8c not found: ID does not exist" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.603462 4836 scope.go:117] "RemoveContainer" containerID="a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d" Feb 17 14:22:40 crc kubenswrapper[4836]: E0217 14:22:40.603952 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d\": container with ID starting with a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d not found: ID does not exist" containerID="a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.604004 4836 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d"} err="failed to get container status \"a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d\": rpc error: code = NotFound desc = could not find container \"a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d\": container with ID starting with a437392fc716ac4c181c21aee412d30402f80e7fb51c5e7b4a9ec90fd620087d not found: ID does not exist" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.604023 4836 scope.go:117] "RemoveContainer" containerID="87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03" Feb 17 14:22:40 crc kubenswrapper[4836]: E0217 14:22:40.604258 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03\": container with ID starting with 87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03 not found: ID does not exist" containerID="87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03" Feb 17 14:22:40 crc kubenswrapper[4836]: I0217 14:22:40.604281 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03"} err="failed to get container status \"87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03\": rpc error: code = NotFound desc = could not find container \"87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03\": container with ID starting with 87a0d8ab30fd50050f265f555c2ac0eecf86edd798af11bdde6d9dab11cbbc03 not found: ID does not exist" Feb 17 14:22:45 crc kubenswrapper[4836]: I0217 14:22:45.037621 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-szl4j" Feb 17 14:22:46 crc kubenswrapper[4836]: I0217 14:22:46.340976 4836 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" event={"ID":"18ec2995-af0c-4c47-aa70-480f9323329e","Type":"ContainerStarted","Data":"1c0920ccfc9e03a93f65617bd3331b91335257e2cf2ec0423fbdf07325adcfa0"} Feb 17 14:22:46 crc kubenswrapper[4836]: I0217 14:22:46.341370 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:22:46 crc kubenswrapper[4836]: I0217 14:22:46.344890 4836 generic.go:334] "Generic (PLEG): container finished" podID="e019f338-ff73-4160-a283-a71e9e6119b3" containerID="1077f06d3d5bebd01c4ffbd75e580dc56862da4c37f0070d4820446af9de47f9" exitCode=0 Feb 17 14:22:46 crc kubenswrapper[4836]: I0217 14:22:46.344926 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerDied","Data":"1077f06d3d5bebd01c4ffbd75e580dc56862da4c37f0070d4820446af9de47f9"} Feb 17 14:22:46 crc kubenswrapper[4836]: I0217 14:22:46.356729 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" podStartSLOduration=1.952169526 podStartE2EDuration="12.356711521s" podCreationTimestamp="2026-02-17 14:22:34 +0000 UTC" firstStartedPulling="2026-02-17 14:22:35.256821831 +0000 UTC m=+981.599750100" lastFinishedPulling="2026-02-17 14:22:45.661363826 +0000 UTC m=+992.004292095" observedRunningTime="2026-02-17 14:22:46.355393357 +0000 UTC m=+992.698321636" watchObservedRunningTime="2026-02-17 14:22:46.356711521 +0000 UTC m=+992.699639790" Feb 17 14:22:46 crc kubenswrapper[4836]: I0217 14:22:46.515220 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pb5ff" Feb 17 14:22:47 crc kubenswrapper[4836]: I0217 14:22:47.352313 4836 generic.go:334] "Generic (PLEG): container finished" podID="e019f338-ff73-4160-a283-a71e9e6119b3" 
containerID="6c83f84061b3a004002402735bacc0efc2a93e244ec45cd30c47231cb2afda75" exitCode=0 Feb 17 14:22:47 crc kubenswrapper[4836]: I0217 14:22:47.352412 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerDied","Data":"6c83f84061b3a004002402735bacc0efc2a93e244ec45cd30c47231cb2afda75"} Feb 17 14:22:48 crc kubenswrapper[4836]: I0217 14:22:48.361062 4836 generic.go:334] "Generic (PLEG): container finished" podID="e019f338-ff73-4160-a283-a71e9e6119b3" containerID="d6a140b47bc595c4e0bd540c1b87ceab5fb212382476eab090184203a2d6d60d" exitCode=0 Feb 17 14:22:48 crc kubenswrapper[4836]: I0217 14:22:48.361185 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerDied","Data":"d6a140b47bc595c4e0bd540c1b87ceab5fb212382476eab090184203a2d6d60d"} Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.374797 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerStarted","Data":"4221d330bfed77c0bd8ca6e95a47e5241b742b1538b0cdbcb61b7daf649469e1"} Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.375161 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerStarted","Data":"7e2f69401c03e46067356f09eec6841704c994a0693fde52e64c714223d21f28"} Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.375174 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerStarted","Data":"8f78fdf8808d9c7835a9f6e5c6bc93f71345184c3cdc90101be5000f422fb1e0"} Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.375188 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerStarted","Data":"e1961ec1710c9130f42a4d633b94b4200234719ebfee80baa144c2765eb9402e"} Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.375198 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerStarted","Data":"1da60767d90fabb62748e8dc88f8c69ccdfa89a32f1b48da2d4d8a96b7c55407"} Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.620546 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f2nk9"] Feb 17 14:22:49 crc kubenswrapper[4836]: E0217 14:22:49.620871 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="registry-server" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.620892 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="registry-server" Feb 17 14:22:49 crc kubenswrapper[4836]: E0217 14:22:49.620910 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="extract-utilities" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.620918 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="extract-utilities" Feb 17 14:22:49 crc kubenswrapper[4836]: E0217 14:22:49.620933 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="extract-content" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.620941 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="extract-content" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.621090 4836 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ee0bd3ed-4af9-40a3-9742-ee548934f0c7" containerName="registry-server" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.621647 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f2nk9" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.624812 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-ntfqc" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.624824 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.627928 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.639369 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f2nk9"] Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.777517 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9v82\" (UniqueName: \"kubernetes.io/projected/edb91fa8-3288-4ae3-b355-7cb7849c1d8d-kube-api-access-n9v82\") pod \"openstack-operator-index-f2nk9\" (UID: \"edb91fa8-3288-4ae3-b355-7cb7849c1d8d\") " pod="openstack-operators/openstack-operator-index-f2nk9" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.879106 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9v82\" (UniqueName: \"kubernetes.io/projected/edb91fa8-3288-4ae3-b355-7cb7849c1d8d-kube-api-access-n9v82\") pod \"openstack-operator-index-f2nk9\" (UID: \"edb91fa8-3288-4ae3-b355-7cb7849c1d8d\") " pod="openstack-operators/openstack-operator-index-f2nk9" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.899189 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n9v82\" (UniqueName: \"kubernetes.io/projected/edb91fa8-3288-4ae3-b355-7cb7849c1d8d-kube-api-access-n9v82\") pod \"openstack-operator-index-f2nk9\" (UID: \"edb91fa8-3288-4ae3-b355-7cb7849c1d8d\") " pod="openstack-operators/openstack-operator-index-f2nk9" Feb 17 14:22:49 crc kubenswrapper[4836]: I0217 14:22:49.940373 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f2nk9" Feb 17 14:22:50 crc kubenswrapper[4836]: I0217 14:22:50.429871 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x257b" event={"ID":"e019f338-ff73-4160-a283-a71e9e6119b3","Type":"ContainerStarted","Data":"973eef3da60090899d57696150c36ebae2399dddc4b5043b47f2d6aed253dbad"} Feb 17 14:22:50 crc kubenswrapper[4836]: I0217 14:22:50.430702 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:50 crc kubenswrapper[4836]: I0217 14:22:50.463697 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-x257b" podStartSLOduration=5.953230552 podStartE2EDuration="16.463675511s" podCreationTimestamp="2026-02-17 14:22:34 +0000 UTC" firstStartedPulling="2026-02-17 14:22:35.131803262 +0000 UTC m=+981.474731531" lastFinishedPulling="2026-02-17 14:22:45.642248221 +0000 UTC m=+991.985176490" observedRunningTime="2026-02-17 14:22:50.46088622 +0000 UTC m=+996.803814509" watchObservedRunningTime="2026-02-17 14:22:50.463675511 +0000 UTC m=+996.806603780" Feb 17 14:22:50 crc kubenswrapper[4836]: I0217 14:22:50.654744 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f2nk9"] Feb 17 14:22:50 crc kubenswrapper[4836]: W0217 14:22:50.665509 4836 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb91fa8_3288_4ae3_b355_7cb7849c1d8d.slice/crio-f2c61c3b6d7068ca074c8d1380495038c7afe3ece774b912db9f11ca0f35855c WatchSource:0}: Error finding container f2c61c3b6d7068ca074c8d1380495038c7afe3ece774b912db9f11ca0f35855c: Status 404 returned error can't find the container with id f2c61c3b6d7068ca074c8d1380495038c7afe3ece774b912db9f11ca0f35855c Feb 17 14:22:51 crc kubenswrapper[4836]: I0217 14:22:51.501878 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f2nk9" event={"ID":"edb91fa8-3288-4ae3-b355-7cb7849c1d8d","Type":"ContainerStarted","Data":"f2c61c3b6d7068ca074c8d1380495038c7afe3ece774b912db9f11ca0f35855c"} Feb 17 14:22:51 crc kubenswrapper[4836]: I0217 14:22:51.575662 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-f2nk9"] Feb 17 14:22:51 crc kubenswrapper[4836]: I0217 14:22:51.982653 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-pz5pz"] Feb 17 14:22:51 crc kubenswrapper[4836]: I0217 14:22:51.983696 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:22:51 crc kubenswrapper[4836]: I0217 14:22:51.994498 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pz5pz"] Feb 17 14:22:52 crc kubenswrapper[4836]: I0217 14:22:52.162254 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxf8\" (UniqueName: \"kubernetes.io/projected/f0982db9-e1ef-4fc9-b7d4-e52ac91e6676-kube-api-access-4lxf8\") pod \"openstack-operator-index-pz5pz\" (UID: \"f0982db9-e1ef-4fc9-b7d4-e52ac91e6676\") " pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:22:52 crc kubenswrapper[4836]: I0217 14:22:52.263751 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lxf8\" (UniqueName: \"kubernetes.io/projected/f0982db9-e1ef-4fc9-b7d4-e52ac91e6676-kube-api-access-4lxf8\") pod \"openstack-operator-index-pz5pz\" (UID: \"f0982db9-e1ef-4fc9-b7d4-e52ac91e6676\") " pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:22:52 crc kubenswrapper[4836]: I0217 14:22:52.291135 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lxf8\" (UniqueName: \"kubernetes.io/projected/f0982db9-e1ef-4fc9-b7d4-e52ac91e6676-kube-api-access-4lxf8\") pod \"openstack-operator-index-pz5pz\" (UID: \"f0982db9-e1ef-4fc9-b7d4-e52ac91e6676\") " pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:22:52 crc kubenswrapper[4836]: I0217 14:22:52.302258 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:22:53 crc kubenswrapper[4836]: I0217 14:22:53.386196 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-pz5pz"] Feb 17 14:22:53 crc kubenswrapper[4836]: I0217 14:22:53.516021 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f2nk9" event={"ID":"edb91fa8-3288-4ae3-b355-7cb7849c1d8d","Type":"ContainerStarted","Data":"da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f"} Feb 17 14:22:53 crc kubenswrapper[4836]: I0217 14:22:53.516190 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-f2nk9" podUID="edb91fa8-3288-4ae3-b355-7cb7849c1d8d" containerName="registry-server" containerID="cri-o://da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f" gracePeriod=2 Feb 17 14:22:53 crc kubenswrapper[4836]: I0217 14:22:53.519287 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pz5pz" event={"ID":"f0982db9-e1ef-4fc9-b7d4-e52ac91e6676","Type":"ContainerStarted","Data":"8548eb24eb9b2410813d3e9fd6c73a7876264a12a1b057b176e8f75d28a659eb"} Feb 17 14:22:53 crc kubenswrapper[4836]: I0217 14:22:53.538678 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f2nk9" podStartSLOduration=1.946951591 podStartE2EDuration="4.538657044s" podCreationTimestamp="2026-02-17 14:22:49 +0000 UTC" firstStartedPulling="2026-02-17 14:22:50.668169134 +0000 UTC m=+997.011097403" lastFinishedPulling="2026-02-17 14:22:53.259874557 +0000 UTC m=+999.602802856" observedRunningTime="2026-02-17 14:22:53.533762419 +0000 UTC m=+999.876690708" watchObservedRunningTime="2026-02-17 14:22:53.538657044 +0000 UTC m=+999.881585313" Feb 17 14:22:53 crc kubenswrapper[4836]: I0217 14:22:53.940992 4836 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f2nk9" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.196566 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9v82\" (UniqueName: \"kubernetes.io/projected/edb91fa8-3288-4ae3-b355-7cb7849c1d8d-kube-api-access-n9v82\") pod \"edb91fa8-3288-4ae3-b355-7cb7849c1d8d\" (UID: \"edb91fa8-3288-4ae3-b355-7cb7849c1d8d\") " Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.203651 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edb91fa8-3288-4ae3-b355-7cb7849c1d8d-kube-api-access-n9v82" (OuterVolumeSpecName: "kube-api-access-n9v82") pod "edb91fa8-3288-4ae3-b355-7cb7849c1d8d" (UID: "edb91fa8-3288-4ae3-b355-7cb7849c1d8d"). InnerVolumeSpecName "kube-api-access-n9v82". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.298846 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9v82\" (UniqueName: \"kubernetes.io/projected/edb91fa8-3288-4ae3-b355-7cb7849c1d8d-kube-api-access-n9v82\") on node \"crc\" DevicePath \"\"" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.529600 4836 generic.go:334] "Generic (PLEG): container finished" podID="edb91fa8-3288-4ae3-b355-7cb7849c1d8d" containerID="da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f" exitCode=0 Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.529639 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-f2nk9" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.529658 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f2nk9" event={"ID":"edb91fa8-3288-4ae3-b355-7cb7849c1d8d","Type":"ContainerDied","Data":"da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f"} Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.530040 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f2nk9" event={"ID":"edb91fa8-3288-4ae3-b355-7cb7849c1d8d","Type":"ContainerDied","Data":"f2c61c3b6d7068ca074c8d1380495038c7afe3ece774b912db9f11ca0f35855c"} Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.530064 4836 scope.go:117] "RemoveContainer" containerID="da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.532996 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-pz5pz" event={"ID":"f0982db9-e1ef-4fc9-b7d4-e52ac91e6676","Type":"ContainerStarted","Data":"ab0fdfa98b6bc72d92c461dbb33cd68bef9f51986312eb90d88af739d4355f06"} Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.551900 4836 scope.go:117] "RemoveContainer" containerID="da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f" Feb 17 14:22:54 crc kubenswrapper[4836]: E0217 14:22:54.552998 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f\": container with ID starting with da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f not found: ID does not exist" containerID="da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.553057 4836 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f"} err="failed to get container status \"da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f\": rpc error: code = NotFound desc = could not find container \"da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f\": container with ID starting with da23f8997cf70d61317c32ce04b51763484e9e9080e955955480e6e878568c8f not found: ID does not exist" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.559533 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-pz5pz" podStartSLOduration=3.513959275 podStartE2EDuration="3.55950842s" podCreationTimestamp="2026-02-17 14:22:51 +0000 UTC" firstStartedPulling="2026-02-17 14:22:53.399377303 +0000 UTC m=+999.742305572" lastFinishedPulling="2026-02-17 14:22:53.444926448 +0000 UTC m=+999.787854717" observedRunningTime="2026-02-17 14:22:54.552107761 +0000 UTC m=+1000.895036030" watchObservedRunningTime="2026-02-17 14:22:54.55950842 +0000 UTC m=+1000.902436689" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.590467 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-f2nk9"] Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.590514 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-f2nk9"] Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.927272 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:54 crc kubenswrapper[4836]: I0217 14:22:54.967327 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-x257b" Feb 17 14:22:56 crc kubenswrapper[4836]: I0217 14:22:56.578082 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edb91fa8-3288-4ae3-b355-7cb7849c1d8d" 
path="/var/lib/kubelet/pods/edb91fa8-3288-4ae3-b355-7cb7849c1d8d/volumes" Feb 17 14:22:59 crc kubenswrapper[4836]: I0217 14:22:59.765148 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:22:59 crc kubenswrapper[4836]: I0217 14:22:59.765558 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:23:02 crc kubenswrapper[4836]: I0217 14:23:02.357674 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:23:02 crc kubenswrapper[4836]: I0217 14:23:02.358018 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:23:02 crc kubenswrapper[4836]: I0217 14:23:02.433173 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:23:02 crc kubenswrapper[4836]: I0217 14:23:02.619841 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-pz5pz" Feb 17 14:23:04 crc kubenswrapper[4836]: I0217 14:23:04.932787 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-x257b" Feb 17 14:23:04 crc kubenswrapper[4836]: I0217 14:23:04.945606 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-mznjt" Feb 17 14:23:10 crc 
kubenswrapper[4836]: I0217 14:23:10.593838 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm"] Feb 17 14:23:10 crc kubenswrapper[4836]: E0217 14:23:10.595747 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edb91fa8-3288-4ae3-b355-7cb7849c1d8d" containerName="registry-server" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.595784 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="edb91fa8-3288-4ae3-b355-7cb7849c1d8d" containerName="registry-server" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.595934 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="edb91fa8-3288-4ae3-b355-7cb7849c1d8d" containerName="registry-server" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.596919 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.600924 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-2zt29" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.607041 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm"] Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.709956 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4g2g\" (UniqueName: \"kubernetes.io/projected/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-kube-api-access-s4g2g\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.710477 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-util\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.710514 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-bundle\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.812078 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4g2g\" (UniqueName: \"kubernetes.io/projected/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-kube-api-access-s4g2g\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.812190 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-util\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.812232 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-bundle\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.813150 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-bundle\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.813372 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-util\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.835868 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4g2g\" (UniqueName: \"kubernetes.io/projected/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-kube-api-access-s4g2g\") pod \"609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:10 crc kubenswrapper[4836]: I0217 14:23:10.933718 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:11 crc kubenswrapper[4836]: I0217 14:23:11.450478 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm"] Feb 17 14:23:11 crc kubenswrapper[4836]: W0217 14:23:11.460405 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc1ca64e_8914_44ae_8d9e_d7c63ba6e166.slice/crio-0ab5a5cd357a3e9c8bf32c08f6dc8b2f32cd15e6e31d32610488207b53a77062 WatchSource:0}: Error finding container 0ab5a5cd357a3e9c8bf32c08f6dc8b2f32cd15e6e31d32610488207b53a77062: Status 404 returned error can't find the container with id 0ab5a5cd357a3e9c8bf32c08f6dc8b2f32cd15e6e31d32610488207b53a77062 Feb 17 14:23:11 crc kubenswrapper[4836]: I0217 14:23:11.669062 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" event={"ID":"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166","Type":"ContainerStarted","Data":"b660291d58fd283d9879c66a66dc2fa63506daff41626c13a2530e7ee2f8b4f6"} Feb 17 14:23:11 crc kubenswrapper[4836]: I0217 14:23:11.669739 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" event={"ID":"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166","Type":"ContainerStarted","Data":"0ab5a5cd357a3e9c8bf32c08f6dc8b2f32cd15e6e31d32610488207b53a77062"} Feb 17 14:23:12 crc kubenswrapper[4836]: I0217 14:23:12.676597 4836 generic.go:334] "Generic (PLEG): container finished" podID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerID="b660291d58fd283d9879c66a66dc2fa63506daff41626c13a2530e7ee2f8b4f6" exitCode=0 Feb 17 14:23:12 crc kubenswrapper[4836]: I0217 14:23:12.676652 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" event={"ID":"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166","Type":"ContainerDied","Data":"b660291d58fd283d9879c66a66dc2fa63506daff41626c13a2530e7ee2f8b4f6"} Feb 17 14:23:13 crc kubenswrapper[4836]: I0217 14:23:13.686143 4836 generic.go:334] "Generic (PLEG): container finished" podID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerID="3c6d713658574434865f5b99cb3e7536bfe66ce51a3f0c64ae27f13273de57c6" exitCode=0 Feb 17 14:23:13 crc kubenswrapper[4836]: I0217 14:23:13.686236 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" event={"ID":"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166","Type":"ContainerDied","Data":"3c6d713658574434865f5b99cb3e7536bfe66ce51a3f0c64ae27f13273de57c6"} Feb 17 14:23:14 crc kubenswrapper[4836]: I0217 14:23:14.697449 4836 generic.go:334] "Generic (PLEG): container finished" podID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerID="3ac8a1b6212e43d3b69fdc02f6daca525345d9fe092da6fec178ed9daccd3f4e" exitCode=0 Feb 17 14:23:14 crc kubenswrapper[4836]: I0217 14:23:14.697636 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" event={"ID":"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166","Type":"ContainerDied","Data":"3ac8a1b6212e43d3b69fdc02f6daca525345d9fe092da6fec178ed9daccd3f4e"} Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.020421 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.127982 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-util\") pod \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.128142 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-bundle\") pod \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.128311 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4g2g\" (UniqueName: \"kubernetes.io/projected/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-kube-api-access-s4g2g\") pod \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\" (UID: \"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166\") " Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.129607 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-bundle" (OuterVolumeSpecName: "bundle") pod "dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" (UID: "dc1ca64e-8914-44ae-8d9e-d7c63ba6e166"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.134715 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-kube-api-access-s4g2g" (OuterVolumeSpecName: "kube-api-access-s4g2g") pod "dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" (UID: "dc1ca64e-8914-44ae-8d9e-d7c63ba6e166"). InnerVolumeSpecName "kube-api-access-s4g2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.141909 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-util" (OuterVolumeSpecName: "util") pod "dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" (UID: "dc1ca64e-8914-44ae-8d9e-d7c63ba6e166"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.230171 4836 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-util\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.230225 4836 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.230246 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4g2g\" (UniqueName: \"kubernetes.io/projected/dc1ca64e-8914-44ae-8d9e-d7c63ba6e166-kube-api-access-s4g2g\") on node \"crc\" DevicePath \"\"" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.715968 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" event={"ID":"dc1ca64e-8914-44ae-8d9e-d7c63ba6e166","Type":"ContainerDied","Data":"0ab5a5cd357a3e9c8bf32c08f6dc8b2f32cd15e6e31d32610488207b53a77062"} Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.716017 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab5a5cd357a3e9c8bf32c08f6dc8b2f32cd15e6e31d32610488207b53a77062" Feb 17 14:23:16 crc kubenswrapper[4836]: I0217 14:23:16.716048 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.839143 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk"] Feb 17 14:23:22 crc kubenswrapper[4836]: E0217 14:23:22.839968 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerName="extract" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.839982 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerName="extract" Feb 17 14:23:22 crc kubenswrapper[4836]: E0217 14:23:22.840003 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerName="pull" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.840009 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerName="pull" Feb 17 14:23:22 crc kubenswrapper[4836]: E0217 14:23:22.840028 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerName="util" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.840034 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerName="util" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.840148 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1ca64e-8914-44ae-8d9e-d7c63ba6e166" containerName="extract" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.840594 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.851581 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-pl9gf" Feb 17 14:23:22 crc kubenswrapper[4836]: I0217 14:23:22.862245 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk"] Feb 17 14:23:23 crc kubenswrapper[4836]: I0217 14:23:23.007963 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6rd\" (UniqueName: \"kubernetes.io/projected/4afa09e7-5273-4170-8c40-6c3ed66e6b8e-kube-api-access-sf6rd\") pod \"openstack-operator-controller-init-7464dc569f-6nqxk\" (UID: \"4afa09e7-5273-4170-8c40-6c3ed66e6b8e\") " pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" Feb 17 14:23:23 crc kubenswrapper[4836]: I0217 14:23:23.109636 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6rd\" (UniqueName: \"kubernetes.io/projected/4afa09e7-5273-4170-8c40-6c3ed66e6b8e-kube-api-access-sf6rd\") pod \"openstack-operator-controller-init-7464dc569f-6nqxk\" (UID: \"4afa09e7-5273-4170-8c40-6c3ed66e6b8e\") " pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" Feb 17 14:23:23 crc kubenswrapper[4836]: I0217 14:23:23.129015 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6rd\" (UniqueName: \"kubernetes.io/projected/4afa09e7-5273-4170-8c40-6c3ed66e6b8e-kube-api-access-sf6rd\") pod \"openstack-operator-controller-init-7464dc569f-6nqxk\" (UID: \"4afa09e7-5273-4170-8c40-6c3ed66e6b8e\") " pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" Feb 17 14:23:23 crc kubenswrapper[4836]: I0217 14:23:23.159772 4836 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" Feb 17 14:23:23 crc kubenswrapper[4836]: I0217 14:23:23.646619 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk"] Feb 17 14:23:23 crc kubenswrapper[4836]: I0217 14:23:23.905856 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" event={"ID":"4afa09e7-5273-4170-8c40-6c3ed66e6b8e","Type":"ContainerStarted","Data":"030e0ff9289543c8e0946be59093d692b43a81a2959df444698c62084e3d15c3"} Feb 17 14:23:29 crc kubenswrapper[4836]: I0217 14:23:29.765658 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:23:29 crc kubenswrapper[4836]: I0217 14:23:29.766056 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:23:30 crc kubenswrapper[4836]: I0217 14:23:30.252625 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" event={"ID":"4afa09e7-5273-4170-8c40-6c3ed66e6b8e","Type":"ContainerStarted","Data":"23eea63b1b0347ac660ddb33113cf73ec732e17d4cb9cae17340f007d044eb4f"} Feb 17 14:23:30 crc kubenswrapper[4836]: I0217 14:23:30.252799 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" Feb 17 14:23:30 crc 
kubenswrapper[4836]: I0217 14:23:30.288194 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" podStartSLOduration=2.216200984 podStartE2EDuration="8.288173353s" podCreationTimestamp="2026-02-17 14:23:22 +0000 UTC" firstStartedPulling="2026-02-17 14:23:23.680590857 +0000 UTC m=+1030.023519116" lastFinishedPulling="2026-02-17 14:23:29.752563216 +0000 UTC m=+1036.095491485" observedRunningTime="2026-02-17 14:23:30.283788541 +0000 UTC m=+1036.626716860" watchObservedRunningTime="2026-02-17 14:23:30.288173353 +0000 UTC m=+1036.631101632" Feb 17 14:23:43 crc kubenswrapper[4836]: I0217 14:23:43.163401 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7464dc569f-6nqxk" Feb 17 14:23:59 crc kubenswrapper[4836]: I0217 14:23:59.765863 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:23:59 crc kubenswrapper[4836]: I0217 14:23:59.766901 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:23:59 crc kubenswrapper[4836]: I0217 14:23:59.766997 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:23:59 crc kubenswrapper[4836]: I0217 14:23:59.768111 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"89b78e4cc2264dc06417ab903dd2a1618c1aee2c1d950babae0b011a2e9eac59"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:23:59 crc kubenswrapper[4836]: I0217 14:23:59.768221 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://89b78e4cc2264dc06417ab903dd2a1618c1aee2c1d950babae0b011a2e9eac59" gracePeriod=600 Feb 17 14:24:00 crc kubenswrapper[4836]: I0217 14:24:00.461819 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="89b78e4cc2264dc06417ab903dd2a1618c1aee2c1d950babae0b011a2e9eac59" exitCode=0 Feb 17 14:24:00 crc kubenswrapper[4836]: I0217 14:24:00.461884 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"89b78e4cc2264dc06417ab903dd2a1618c1aee2c1d950babae0b011a2e9eac59"} Feb 17 14:24:00 crc kubenswrapper[4836]: I0217 14:24:00.462171 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"790067b54b3531952a7756a09b793da1fc53330ef71b8011e59f530ae444594e"} Feb 17 14:24:00 crc kubenswrapper[4836]: I0217 14:24:00.462197 4836 scope.go:117] "RemoveContainer" containerID="d7f43ee4be167fb696d056804834f76d74b6a96b2dd00fc7f1328e7b9c2e7869" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.006084 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm"] Feb 17 14:24:13 crc 
kubenswrapper[4836]: I0217 14:24:13.009033 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.016173 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-gxh86" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.019574 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-54696"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.020873 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.030617 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-kg9lq" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.045429 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-54696"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.058309 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.093442 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.095069 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.110443 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-zxb25"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.111490 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.124692 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zhj69" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.125012 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6xkvk" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.159927 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.161369 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.169135 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-cptpb" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.186110 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v292x\" (UniqueName: \"kubernetes.io/projected/12cff299-e5ea-40a9-8a69-528c478cd0a0-kube-api-access-v292x\") pod \"cinder-operator-controller-manager-5d946d989d-b6cfm\" (UID: \"12cff299-e5ea-40a9-8a69-528c478cd0a0\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.186168 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcc2j\" (UniqueName: \"kubernetes.io/projected/a7c6acc7-4243-4c0d-a723-e83dc2e054df-kube-api-access-xcc2j\") pod \"barbican-operator-controller-manager-868647ff47-54696\" (UID: \"a7c6acc7-4243-4c0d-a723-e83dc2e054df\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.186318 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.255116 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-zxb25"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.275366 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.276428 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.287491 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v292x\" (UniqueName: \"kubernetes.io/projected/12cff299-e5ea-40a9-8a69-528c478cd0a0-kube-api-access-v292x\") pod \"cinder-operator-controller-manager-5d946d989d-b6cfm\" (UID: \"12cff299-e5ea-40a9-8a69-528c478cd0a0\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.287565 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcc2j\" (UniqueName: \"kubernetes.io/projected/a7c6acc7-4243-4c0d-a723-e83dc2e054df-kube-api-access-xcc2j\") pod \"barbican-operator-controller-manager-868647ff47-54696\" (UID: \"a7c6acc7-4243-4c0d-a723-e83dc2e054df\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.287625 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvkhm\" (UniqueName: \"kubernetes.io/projected/ce77a6a5-95bb-4758-8a38-cdc354fd9d6c-kube-api-access-pvkhm\") pod \"glance-operator-controller-manager-77987464f4-zxb25\" (UID: \"ce77a6a5-95bb-4758-8a38-cdc354fd9d6c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.287673 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmz8x\" (UniqueName: \"kubernetes.io/projected/0962ca43-43c4-4884-bd8e-889835f83632-kube-api-access-bmz8x\") pod \"designate-operator-controller-manager-6d8bf5c495-8wdwr\" (UID: \"0962ca43-43c4-4884-bd8e-889835f83632\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" Feb 17 
14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.287709 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dzj\" (UniqueName: \"kubernetes.io/projected/c3d9def3-7f53-4acc-9c46-d37ddf41e3b7-kube-api-access-s5dzj\") pod \"heat-operator-controller-manager-69f49c598c-7vwdd\" (UID: \"c3d9def3-7f53-4acc-9c46-d37ddf41e3b7\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.296374 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-kjtcf" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.296623 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.345392 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.351054 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcc2j\" (UniqueName: \"kubernetes.io/projected/a7c6acc7-4243-4c0d-a723-e83dc2e054df-kube-api-access-xcc2j\") pod \"barbican-operator-controller-manager-868647ff47-54696\" (UID: \"a7c6acc7-4243-4c0d-a723-e83dc2e054df\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.356033 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v292x\" (UniqueName: \"kubernetes.io/projected/12cff299-e5ea-40a9-8a69-528c478cd0a0-kube-api-access-v292x\") pod \"cinder-operator-controller-manager-5d946d989d-b6cfm\" (UID: \"12cff299-e5ea-40a9-8a69-528c478cd0a0\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" Feb 17 14:24:13 crc 
kubenswrapper[4836]: I0217 14:24:13.362237 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.390092 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvkhm\" (UniqueName: \"kubernetes.io/projected/ce77a6a5-95bb-4758-8a38-cdc354fd9d6c-kube-api-access-pvkhm\") pod \"glance-operator-controller-manager-77987464f4-zxb25\" (UID: \"ce77a6a5-95bb-4758-8a38-cdc354fd9d6c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.390213 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmz8x\" (UniqueName: \"kubernetes.io/projected/0962ca43-43c4-4884-bd8e-889835f83632-kube-api-access-bmz8x\") pod \"designate-operator-controller-manager-6d8bf5c495-8wdwr\" (UID: \"0962ca43-43c4-4884-bd8e-889835f83632\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.390268 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dzj\" (UniqueName: \"kubernetes.io/projected/c3d9def3-7f53-4acc-9c46-d37ddf41e3b7-kube-api-access-s5dzj\") pod \"heat-operator-controller-manager-69f49c598c-7vwdd\" (UID: \"c3d9def3-7f53-4acc-9c46-d37ddf41e3b7\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.390371 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmrdc\" (UniqueName: \"kubernetes.io/projected/f2e6ac9f-ee72-4a28-b298-9b2f918d0c95-kube-api-access-xmrdc\") pod \"horizon-operator-controller-manager-5b9b8895d5-bv7s8\" (UID: \"f2e6ac9f-ee72-4a28-b298-9b2f918d0c95\") " 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.391561 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.401766 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.403440 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.435590 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.435923 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-s5464" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.460743 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.487629 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvkhm\" (UniqueName: \"kubernetes.io/projected/ce77a6a5-95bb-4758-8a38-cdc354fd9d6c-kube-api-access-pvkhm\") pod \"glance-operator-controller-manager-77987464f4-zxb25\" (UID: \"ce77a6a5-95bb-4758-8a38-cdc354fd9d6c\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.489212 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmz8x\" (UniqueName: 
\"kubernetes.io/projected/0962ca43-43c4-4884-bd8e-889835f83632-kube-api-access-bmz8x\") pod \"designate-operator-controller-manager-6d8bf5c495-8wdwr\" (UID: \"0962ca43-43c4-4884-bd8e-889835f83632\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.508804 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.509007 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crsk\" (UniqueName: \"kubernetes.io/projected/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-kube-api-access-9crsk\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.509077 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmrdc\" (UniqueName: \"kubernetes.io/projected/f2e6ac9f-ee72-4a28-b298-9b2f918d0c95-kube-api-access-xmrdc\") pod \"horizon-operator-controller-manager-5b9b8895d5-bv7s8\" (UID: \"f2e6ac9f-ee72-4a28-b298-9b2f918d0c95\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.528240 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dzj\" (UniqueName: \"kubernetes.io/projected/c3d9def3-7f53-4acc-9c46-d37ddf41e3b7-kube-api-access-s5dzj\") pod \"heat-operator-controller-manager-69f49c598c-7vwdd\" (UID: 
\"c3d9def3-7f53-4acc-9c46-d37ddf41e3b7\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.552346 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.553865 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.568447 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-4jqnd" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.574006 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmrdc\" (UniqueName: \"kubernetes.io/projected/f2e6ac9f-ee72-4a28-b298-9b2f918d0c95-kube-api-access-xmrdc\") pod \"horizon-operator-controller-manager-5b9b8895d5-bv7s8\" (UID: \"f2e6ac9f-ee72-4a28-b298-9b2f918d0c95\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.574083 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.575018 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.594708 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6wzf8" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.597127 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.611331 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.613577 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.613782 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9crsk\" (UniqueName: \"kubernetes.io/projected/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-kube-api-access-9crsk\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:13 crc kubenswrapper[4836]: E0217 14:24:13.615882 4836 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:13 crc kubenswrapper[4836]: E0217 14:24:13.615985 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert 
podName:a1ae24b8-83c8-416d-9d39-24d84eb6cd83 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:14.115940765 +0000 UTC m=+1080.458869034 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert") pod "infra-operator-controller-manager-79d975b745-f4fvp" (UID: "a1ae24b8-83c8-416d-9d39-24d84eb6cd83") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.623651 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.626479 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.630670 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-plc5f" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.630985 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.639521 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.642695 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.648346 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rrpf5" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.658100 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crsk\" (UniqueName: \"kubernetes.io/projected/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-kube-api-access-9crsk\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.656026 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.669512 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.678016 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.679458 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.686612 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lw755" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.692047 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.776975 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.778900 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.781910 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ct5wd" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.787458 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.795374 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nnnd\" (UniqueName: \"kubernetes.io/projected/e805966b-ea22-4c2a-a6c4-3622300fcb2f-kube-api-access-4nnnd\") pod \"ironic-operator-controller-manager-554564d7fc-k9p46\" (UID: \"e805966b-ea22-4c2a-a6c4-3622300fcb2f\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.796009 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhcrg\" (UniqueName: 
\"kubernetes.io/projected/18a63480-edc2-44ed-bd43-b7750f7f8f33-kube-api-access-fhcrg\") pod \"keystone-operator-controller-manager-b4d948c87-qnb5b\" (UID: \"18a63480-edc2-44ed-bd43-b7750f7f8f33\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.796448 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.796636 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.806816 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn8jl\" (UniqueName: \"kubernetes.io/projected/9ccd7ed5-2772-4482-af31-2578e98011fd-kube-api-access-wn8jl\") pod \"manila-operator-controller-manager-54f6768c69-6lzts\" (UID: \"9ccd7ed5-2772-4482-af31-2578e98011fd\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.812326 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.916128 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhcrg\" (UniqueName: \"kubernetes.io/projected/18a63480-edc2-44ed-bd43-b7750f7f8f33-kube-api-access-fhcrg\") pod \"keystone-operator-controller-manager-b4d948c87-qnb5b\" (UID: \"18a63480-edc2-44ed-bd43-b7750f7f8f33\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.916217 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn8jl\" (UniqueName: \"kubernetes.io/projected/9ccd7ed5-2772-4482-af31-2578e98011fd-kube-api-access-wn8jl\") pod \"manila-operator-controller-manager-54f6768c69-6lzts\" (UID: \"9ccd7ed5-2772-4482-af31-2578e98011fd\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.950382 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm"] Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.951851 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.916288 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nnnd\" (UniqueName: \"kubernetes.io/projected/e805966b-ea22-4c2a-a6c4-3622300fcb2f-kube-api-access-4nnnd\") pod \"ironic-operator-controller-manager-554564d7fc-k9p46\" (UID: \"e805966b-ea22-4c2a-a6c4-3622300fcb2f\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.957575 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwb95\" (UniqueName: \"kubernetes.io/projected/3d12b131-73a0-477e-ab9e-579309b0f5b1-kube-api-access-mwb95\") pod \"neutron-operator-controller-manager-64ddbf8bb-6c4rn\" (UID: \"3d12b131-73a0-477e-ab9e-579309b0f5b1\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.957624 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x2cc\" (UniqueName: \"kubernetes.io/projected/7b9749c7-038f-4814-9357-623346c9172c-kube-api-access-6x2cc\") pod \"mariadb-operator-controller-manager-6994f66f48-zkzrs\" (UID: \"7b9749c7-038f-4814-9357-623346c9172c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.959736 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4f74j" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.974696 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn8jl\" (UniqueName: \"kubernetes.io/projected/9ccd7ed5-2772-4482-af31-2578e98011fd-kube-api-access-wn8jl\") pod 
\"manila-operator-controller-manager-54f6768c69-6lzts\" (UID: \"9ccd7ed5-2772-4482-af31-2578e98011fd\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.977882 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhcrg\" (UniqueName: \"kubernetes.io/projected/18a63480-edc2-44ed-bd43-b7750f7f8f33-kube-api-access-fhcrg\") pod \"keystone-operator-controller-manager-b4d948c87-qnb5b\" (UID: \"18a63480-edc2-44ed-bd43-b7750f7f8f33\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.980381 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nnnd\" (UniqueName: \"kubernetes.io/projected/e805966b-ea22-4c2a-a6c4-3622300fcb2f-kube-api-access-4nnnd\") pod \"ironic-operator-controller-manager-554564d7fc-k9p46\" (UID: \"e805966b-ea22-4c2a-a6c4-3622300fcb2f\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.992475 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" Feb 17 14:24:13 crc kubenswrapper[4836]: I0217 14:24:13.993217 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.059100 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxl4z\" (UniqueName: \"kubernetes.io/projected/52a90e1a-0e2d-4488-8a1a-34de15bfa3a5-kube-api-access-jxl4z\") pod \"nova-operator-controller-manager-567668f5cf-5hz7c\" (UID: \"52a90e1a-0e2d-4488-8a1a-34de15bfa3a5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.059233 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwb95\" (UniqueName: \"kubernetes.io/projected/3d12b131-73a0-477e-ab9e-579309b0f5b1-kube-api-access-mwb95\") pod \"neutron-operator-controller-manager-64ddbf8bb-6c4rn\" (UID: \"3d12b131-73a0-477e-ab9e-579309b0f5b1\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.059268 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x2cc\" (UniqueName: \"kubernetes.io/projected/7b9749c7-038f-4814-9357-623346c9172c-kube-api-access-6x2cc\") pod \"mariadb-operator-controller-manager-6994f66f48-zkzrs\" (UID: \"7b9749c7-038f-4814-9357-623346c9172c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.059316 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxx68\" (UniqueName: \"kubernetes.io/projected/1bb12b86-1f25-4dd9-a44d-449a6deee701-kube-api-access-rxx68\") pod 
\"octavia-operator-controller-manager-69f8888797-llzlm\" (UID: \"1bb12b86-1f25-4dd9-a44d-449a6deee701\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.093384 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwb95\" (UniqueName: \"kubernetes.io/projected/3d12b131-73a0-477e-ab9e-579309b0f5b1-kube-api-access-mwb95\") pod \"neutron-operator-controller-manager-64ddbf8bb-6c4rn\" (UID: \"3d12b131-73a0-477e-ab9e-579309b0f5b1\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.113369 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x2cc\" (UniqueName: \"kubernetes.io/projected/7b9749c7-038f-4814-9357-623346c9172c-kube-api-access-6x2cc\") pod \"mariadb-operator-controller-manager-6994f66f48-zkzrs\" (UID: \"7b9749c7-038f-4814-9357-623346c9172c\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.125026 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.126654 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.133212 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.133531 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-sz5r7" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.138163 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.146819 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.156404 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.161248 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxx68\" (UniqueName: \"kubernetes.io/projected/1bb12b86-1f25-4dd9-a44d-449a6deee701-kube-api-access-rxx68\") pod \"octavia-operator-controller-manager-69f8888797-llzlm\" (UID: \"1bb12b86-1f25-4dd9-a44d-449a6deee701\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.161370 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.161496 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxl4z\" (UniqueName: \"kubernetes.io/projected/52a90e1a-0e2d-4488-8a1a-34de15bfa3a5-kube-api-access-jxl4z\") pod \"nova-operator-controller-manager-567668f5cf-5hz7c\" (UID: \"52a90e1a-0e2d-4488-8a1a-34de15bfa3a5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" Feb 17 14:24:14 crc kubenswrapper[4836]: E0217 14:24:14.162252 4836 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.170977 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.172178 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" Feb 17 14:24:14 crc kubenswrapper[4836]: E0217 14:24:14.173453 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert podName:a1ae24b8-83c8-416d-9d39-24d84eb6cd83 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:15.173422866 +0000 UTC m=+1081.516351125 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert") pod "infra-operator-controller-manager-79d975b745-f4fvp" (UID: "a1ae24b8-83c8-416d-9d39-24d84eb6cd83") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.176567 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-p4gww" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.178378 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-p7w4w" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.178521 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.182787 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.184957 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.191642 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxl4z\" (UniqueName: \"kubernetes.io/projected/52a90e1a-0e2d-4488-8a1a-34de15bfa3a5-kube-api-access-jxl4z\") pod \"nova-operator-controller-manager-567668f5cf-5hz7c\" (UID: \"52a90e1a-0e2d-4488-8a1a-34de15bfa3a5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.192710 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxx68\" (UniqueName: \"kubernetes.io/projected/1bb12b86-1f25-4dd9-a44d-449a6deee701-kube-api-access-rxx68\") pod \"octavia-operator-controller-manager-69f8888797-llzlm\" (UID: \"1bb12b86-1f25-4dd9-a44d-449a6deee701\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.204762 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bwcmk" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.208553 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.228181 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.237666 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.241394 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.250475 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7k58q" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.257355 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.261752 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.265058 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzgjp\" (UniqueName: \"kubernetes.io/projected/4affaaf4-1113-4635-b30f-da26e04f6662-kube-api-access-fzgjp\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.265134 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb76g\" (UniqueName: \"kubernetes.io/projected/cf7c4631-b19a-4160-8581-15f72869a60b-kube-api-access-fb76g\") pod \"placement-operator-controller-manager-8497b45c89-jnxzt\" (UID: \"cf7c4631-b19a-4160-8581-15f72869a60b\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.265338 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.265397 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv6mg\" (UniqueName: \"kubernetes.io/projected/d0c3c41c-ac60-40f0-bdfb-8fe641c9426a-kube-api-access-kv6mg\") pod \"swift-operator-controller-manager-68f46476f-7ktgs\" (UID: \"d0c3c41c-ac60-40f0-bdfb-8fe641c9426a\") " 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.265444 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpqt\" (UniqueName: \"kubernetes.io/projected/f6ba6343-872d-4e36-accf-959bb437f82d-kube-api-access-xqpqt\") pod \"ovn-operator-controller-manager-d44cf6b75-mq76b\" (UID: \"f6ba6343-872d-4e36-accf-959bb437f82d\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.318392 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.319474 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.346146 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.367065 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.367136 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv6mg\" (UniqueName: \"kubernetes.io/projected/d0c3c41c-ac60-40f0-bdfb-8fe641c9426a-kube-api-access-kv6mg\") pod \"swift-operator-controller-manager-68f46476f-7ktgs\" (UID: \"d0c3c41c-ac60-40f0-bdfb-8fe641c9426a\") " 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.367184 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpqt\" (UniqueName: \"kubernetes.io/projected/f6ba6343-872d-4e36-accf-959bb437f82d-kube-api-access-xqpqt\") pod \"ovn-operator-controller-manager-d44cf6b75-mq76b\" (UID: \"f6ba6343-872d-4e36-accf-959bb437f82d\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.367244 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzgjp\" (UniqueName: \"kubernetes.io/projected/4affaaf4-1113-4635-b30f-da26e04f6662-kube-api-access-fzgjp\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.367283 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb76g\" (UniqueName: \"kubernetes.io/projected/cf7c4631-b19a-4160-8581-15f72869a60b-kube-api-access-fb76g\") pod \"placement-operator-controller-manager-8497b45c89-jnxzt\" (UID: \"cf7c4631-b19a-4160-8581-15f72869a60b\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.367434 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgtd\" (UniqueName: \"kubernetes.io/projected/a3c22d9b-6ba0-4dd2-861d-8685c18e9330-kube-api-access-mzgtd\") pod \"telemetry-operator-controller-manager-6d6964fcdb-rbq62\" (UID: \"a3c22d9b-6ba0-4dd2-861d-8685c18e9330\") " pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" Feb 17 14:24:14 crc kubenswrapper[4836]: 
E0217 14:24:14.367684 4836 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:14 crc kubenswrapper[4836]: E0217 14:24:14.367769 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert podName:4affaaf4-1113-4635-b30f-da26e04f6662 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:14.86773602 +0000 UTC m=+1081.210664289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" (UID: "4affaaf4-1113-4635-b30f-da26e04f6662") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.374434 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-ztvz2"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.375986 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.389731 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qgk6f" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.397617 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.401310 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzgjp\" (UniqueName: \"kubernetes.io/projected/4affaaf4-1113-4635-b30f-da26e04f6662-kube-api-access-fzgjp\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.403914 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.418743 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpqt\" (UniqueName: \"kubernetes.io/projected/f6ba6343-872d-4e36-accf-959bb437f82d-kube-api-access-xqpqt\") pod \"ovn-operator-controller-manager-d44cf6b75-mq76b\" (UID: \"f6ba6343-872d-4e36-accf-959bb437f82d\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.428388 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv6mg\" (UniqueName: \"kubernetes.io/projected/d0c3c41c-ac60-40f0-bdfb-8fe641c9426a-kube-api-access-kv6mg\") pod \"swift-operator-controller-manager-68f46476f-7ktgs\" (UID: \"d0c3c41c-ac60-40f0-bdfb-8fe641c9426a\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.428555 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb76g\" (UniqueName: \"kubernetes.io/projected/cf7c4631-b19a-4160-8581-15f72869a60b-kube-api-access-fb76g\") pod \"placement-operator-controller-manager-8497b45c89-jnxzt\" 
(UID: \"cf7c4631-b19a-4160-8581-15f72869a60b\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.469405 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgtd\" (UniqueName: \"kubernetes.io/projected/a3c22d9b-6ba0-4dd2-861d-8685c18e9330-kube-api-access-mzgtd\") pod \"telemetry-operator-controller-manager-6d6964fcdb-rbq62\" (UID: \"a3c22d9b-6ba0-4dd2-861d-8685c18e9330\") " pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.470157 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9gnl\" (UniqueName: \"kubernetes.io/projected/d4aa765a-0f56-4f05-b02f-f041841bc97d-kube-api-access-j9gnl\") pod \"test-operator-controller-manager-7866795846-ztvz2\" (UID: \"d4aa765a-0f56-4f05-b02f-f041841bc97d\") " pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.482342 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-ztvz2"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.492232 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgtd\" (UniqueName: \"kubernetes.io/projected/a3c22d9b-6ba0-4dd2-861d-8685c18e9330-kube-api-access-mzgtd\") pod \"telemetry-operator-controller-manager-6d6964fcdb-rbq62\" (UID: \"a3c22d9b-6ba0-4dd2-861d-8685c18e9330\") " pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.527544 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.529480 4836 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.534798 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-nvvvm" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.534878 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-p4gww" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.539481 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.544837 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.565634 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-p7w4w" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.571633 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9gnl\" (UniqueName: \"kubernetes.io/projected/d4aa765a-0f56-4f05-b02f-f041841bc97d-kube-api-access-j9gnl\") pod \"test-operator-controller-manager-7866795846-ztvz2\" (UID: \"d4aa765a-0f56-4f05-b02f-f041841bc97d\") " pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.571737 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf6h8\" (UniqueName: \"kubernetes.io/projected/1f238b1a-4c0c-45de-bb7a-12946f426b89-kube-api-access-nf6h8\") pod \"watcher-operator-controller-manager-5db88f68c-lmtng\" (UID: 
\"1f238b1a-4c0c-45de-bb7a-12946f426b89\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.574469 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.602213 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bwcmk" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.606833 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9gnl\" (UniqueName: \"kubernetes.io/projected/d4aa765a-0f56-4f05-b02f-f041841bc97d-kube-api-access-j9gnl\") pod \"test-operator-controller-manager-7866795846-ztvz2\" (UID: \"d4aa765a-0f56-4f05-b02f-f041841bc97d\") " pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.617629 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.619768 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7k58q" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.628773 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.679413 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf6h8\" (UniqueName: \"kubernetes.io/projected/1f238b1a-4c0c-45de-bb7a-12946f426b89-kube-api-access-nf6h8\") pod \"watcher-operator-controller-manager-5db88f68c-lmtng\" (UID: \"1f238b1a-4c0c-45de-bb7a-12946f426b89\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.794131 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.804857 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf6h8\" (UniqueName: \"kubernetes.io/projected/1f238b1a-4c0c-45de-bb7a-12946f426b89-kube-api-access-nf6h8\") pod \"watcher-operator-controller-manager-5db88f68c-lmtng\" (UID: \"1f238b1a-4c0c-45de-bb7a-12946f426b89\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.816798 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qgk6f" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.817886 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.817955 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" event={"ID":"12cff299-e5ea-40a9-8a69-528c478cd0a0","Type":"ContainerStarted","Data":"f0e2544dbf0606b5855b9877ed7f1c369ec341b58e357490987b5a8c45726507"} Feb 17 14:24:14 crc 
kubenswrapper[4836]: I0217 14:24:14.818002 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.818681 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.819055 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.823711 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.826746 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.827010 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zv8s6" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.827134 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.827271 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7vtfx" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.828815 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.828818 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.892699 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7wzm\" (UniqueName: \"kubernetes.io/projected/d423f7ba-2751-4d99-8102-3bc52b302161-kube-api-access-c7wzm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-w4dds\" (UID: \"d423f7ba-2751-4d99-8102-3bc52b302161\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.892770 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.892851 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.892920 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msdtf\" (UniqueName: \"kubernetes.io/projected/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-kube-api-access-msdtf\") pod 
\"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.894058 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:14 crc kubenswrapper[4836]: E0217 14:24:14.894429 4836 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:14 crc kubenswrapper[4836]: E0217 14:24:14.894509 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert podName:4affaaf4-1113-4635-b30f-da26e04f6662 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:15.894481193 +0000 UTC m=+1082.237409462 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" (UID: "4affaaf4-1113-4635-b30f-da26e04f6662") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.942747 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm"] Feb 17 14:24:14 crc kubenswrapper[4836]: I0217 14:24:14.950175 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-54696"] Feb 17 14:24:14 crc kubenswrapper[4836]: W0217 14:24:14.955196 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2e6ac9f_ee72_4a28_b298_9b2f918d0c95.slice/crio-1c5266b110bd54fbb84c71c41cbfe10738eb0ef0054b9bc9159f134c8b2ea0dc WatchSource:0}: Error finding container 1c5266b110bd54fbb84c71c41cbfe10738eb0ef0054b9bc9159f134c8b2ea0dc: Status 404 returned error can't find the container with id 1c5266b110bd54fbb84c71c41cbfe10738eb0ef0054b9bc9159f134c8b2ea0dc Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:14.994890 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:14.995285 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msdtf\" (UniqueName: \"kubernetes.io/projected/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-kube-api-access-msdtf\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:14.995415 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7wzm\" (UniqueName: \"kubernetes.io/projected/d423f7ba-2751-4d99-8102-3bc52b302161-kube-api-access-c7wzm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-w4dds\" (UID: \"d423f7ba-2751-4d99-8102-3bc52b302161\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:14.995453 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:14.995497 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:14.995736 4836 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:14.995829 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:15.49580387 +0000 UTC m=+1081.838732139 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "webhook-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:14.996380 4836 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:14.996409 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:15.496400657 +0000 UTC m=+1081.839328926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "metrics-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.039543 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msdtf\" (UniqueName: \"kubernetes.io/projected/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-kube-api-access-msdtf\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.043969 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7wzm\" (UniqueName: \"kubernetes.io/projected/d423f7ba-2751-4d99-8102-3bc52b302161-kube-api-access-c7wzm\") pod \"rabbitmq-cluster-operator-manager-668c99d594-w4dds\" (UID: \"d423f7ba-2751-4d99-8102-3bc52b302161\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.048149 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8"] Feb 17 14:24:15 crc kubenswrapper[4836]: W0217 14:24:15.066028 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3d9def3_7f53_4acc_9c46_d37ddf41e3b7.slice/crio-f26465dbce79c4b5ef61f928c2a02c31f40927ec75eefaf99a897962ce499a52 WatchSource:0}: Error finding container f26465dbce79c4b5ef61f928c2a02c31f40927ec75eefaf99a897962ce499a52: Status 404 returned error can't find the container with id f26465dbce79c4b5ef61f928c2a02c31f40927ec75eefaf99a897962ce499a52 Feb 17 14:24:15 crc kubenswrapper[4836]: W0217 14:24:15.083769 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce77a6a5_95bb_4758_8a38_cdc354fd9d6c.slice/crio-d824721a82bfdf10244745d8cd3c55eae95498b0fa8b0395b92cd93df225144b WatchSource:0}: Error finding container d824721a82bfdf10244745d8cd3c55eae95498b0fa8b0395b92cd93df225144b: Status 404 returned error can't find the container with id d824721a82bfdf10244745d8cd3c55eae95498b0fa8b0395b92cd93df225144b Feb 17 14:24:15 crc kubenswrapper[4836]: W0217 14:24:15.087366 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ccd7ed5_2772_4482_af31_2578e98011fd.slice/crio-d8e81765f872a70ad232d847ea502382138fc1e8a48c4df928c04a6c2002df5e WatchSource:0}: Error finding container d8e81765f872a70ad232d847ea502382138fc1e8a48c4df928c04a6c2002df5e: Status 404 returned error can't find the container with id d8e81765f872a70ad232d847ea502382138fc1e8a48c4df928c04a6c2002df5e Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.095453 4836 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-zxb25"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.107216 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.113925 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.198794 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.199103 4836 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.199231 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert podName:a1ae24b8-83c8-416d-9d39-24d84eb6cd83 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:17.199199706 +0000 UTC m=+1083.542127975 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert") pod "infra-operator-controller-manager-79d975b745-f4fvp" (UID: "a1ae24b8-83c8-416d-9d39-24d84eb6cd83") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.250184 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.298870 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.513855 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.514599 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.514500 4836 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.514850 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:16.514828588 +0000 UTC m=+1082.857756857 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "metrics-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.519041 4836 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.519185 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:16.519151723 +0000 UTC m=+1082.862079982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "webhook-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.647025 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.666117 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.700779 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.711138 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.719993 
4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.736614 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs"] Feb 17 14:24:15 crc kubenswrapper[4836]: W0217 14:24:15.772686 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bb12b86_1f25_4dd9_a44d_449a6deee701.slice/crio-40b90be755541b9ddeaa9549b544f011bdd8e2685e2138b6bdb095591a633653 WatchSource:0}: Error finding container 40b90be755541b9ddeaa9549b544f011bdd8e2685e2138b6bdb095591a633653: Status 404 returned error can't find the container with id 40b90be755541b9ddeaa9549b544f011bdd8e2685e2138b6bdb095591a633653 Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.819607 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" event={"ID":"3d12b131-73a0-477e-ab9e-579309b0f5b1","Type":"ContainerStarted","Data":"c8912a4bc2b101eba8bbee5f1f7afc6d900d84969b246a0e6abb7d5c0cd1df2d"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.831497 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" event={"ID":"9ccd7ed5-2772-4482-af31-2578e98011fd","Type":"ContainerStarted","Data":"d8e81765f872a70ad232d847ea502382138fc1e8a48c4df928c04a6c2002df5e"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.841758 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.849481 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng"] Feb 17 14:24:15 crc 
kubenswrapper[4836]: I0217 14:24:15.849677 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" event={"ID":"0962ca43-43c4-4884-bd8e-889835f83632","Type":"ContainerStarted","Data":"f394b4f4f43975965ffb40a146c483db0820fddb1dafee6dde3e2b1a9ffb53f9"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.863856 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" event={"ID":"52a90e1a-0e2d-4488-8a1a-34de15bfa3a5","Type":"ContainerStarted","Data":"7b277bafe1bdb6ad7d8d85eb8eb55e3fed5a8cf1ca8b1e29105ae1ba7b762ecc"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.869019 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" event={"ID":"1bb12b86-1f25-4dd9-a44d-449a6deee701","Type":"ContainerStarted","Data":"40b90be755541b9ddeaa9549b544f011bdd8e2685e2138b6bdb095591a633653"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.873341 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt"] Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.882778 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" event={"ID":"e805966b-ea22-4c2a-a6c4-3622300fcb2f","Type":"ContainerStarted","Data":"ad645d8a39c485ac1537e0c873a6462efe04d66ebbd924bc5bbc4a3bfa35f8c3"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.889542 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" event={"ID":"a7c6acc7-4243-4c0d-a723-e83dc2e054df","Type":"ContainerStarted","Data":"ec05a892d51c7f23653352e29351486f6303b3c71b861fa1a6b3fc41171fa4c0"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.894303 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" event={"ID":"f6ba6343-872d-4e36-accf-959bb437f82d","Type":"ContainerStarted","Data":"dcbc61f55de5c6a8bc8c1201190948ae939f79d04e713c062f127c47cea3b8d2"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.904510 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" event={"ID":"c3d9def3-7f53-4acc-9c46-d37ddf41e3b7","Type":"ContainerStarted","Data":"f26465dbce79c4b5ef61f928c2a02c31f40927ec75eefaf99a897962ce499a52"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.910360 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" event={"ID":"f2e6ac9f-ee72-4a28-b298-9b2f918d0c95","Type":"ContainerStarted","Data":"1c5266b110bd54fbb84c71c41cbfe10738eb0ef0054b9bc9159f134c8b2ea0dc"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.914240 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" event={"ID":"ce77a6a5-95bb-4758-8a38-cdc354fd9d6c","Type":"ContainerStarted","Data":"d824721a82bfdf10244745d8cd3c55eae95498b0fa8b0395b92cd93df225144b"} Feb 17 14:24:15 crc kubenswrapper[4836]: I0217 14:24:15.924102 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.924415 4836 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not 
found Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.924483 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert podName:4affaaf4-1113-4635-b30f-da26e04f6662 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:17.924466615 +0000 UTC m=+1084.267394894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" (UID: "4affaaf4-1113-4635-b30f-da26e04f6662") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:15 crc kubenswrapper[4836]: W0217 14:24:15.941448 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf7c4631_b19a_4160_8581_15f72869a60b.slice/crio-34932151fc3b56e1b5a94958b6f702e6c96c37cce853efb4e3e302718aa28b8e WatchSource:0}: Error finding container 34932151fc3b56e1b5a94958b6f702e6c96c37cce853efb4e3e302718aa28b8e: Status 404 returned error can't find the container with id 34932151fc3b56e1b5a94958b6f702e6c96c37cce853efb4e3e302718aa28b8e Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.948026 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fb76g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-jnxzt_openstack-operators(cf7c4631-b19a-4160-8581-15f72869a60b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 14:24:15 crc kubenswrapper[4836]: E0217 14:24:15.952816 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" podUID="cf7c4631-b19a-4160-8581-15f72869a60b" Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.009141 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs"] Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.065863 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-ztvz2"] Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.184152 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds"] Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.221562 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c7wzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-w4dds_openstack-operators(d423f7ba-2751-4d99-8102-3bc52b302161): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.226371 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" podUID="d423f7ba-2751-4d99-8102-3bc52b302161" Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.550383 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.550487 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.550709 4836 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.550800 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:18.550775898 +0000 UTC m=+1084.893704167 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "webhook-server-cert" not found Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.550861 4836 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.550957 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:18.550878431 +0000 UTC m=+1084.893806700 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "metrics-server-cert" not found Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.931701 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" event={"ID":"d4aa765a-0f56-4f05-b02f-f041841bc97d","Type":"ContainerStarted","Data":"d5e146a6374cb5855c0a40aff2316fe3979a5e72b86ecab58274bc46f95f188c"} Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.934533 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" event={"ID":"1f238b1a-4c0c-45de-bb7a-12946f426b89","Type":"ContainerStarted","Data":"12dc4cadb3992c9e1df317fc358c26ece1879c305e74ffac515fe389fd8acb19"} Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.936945 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" event={"ID":"d423f7ba-2751-4d99-8102-3bc52b302161","Type":"ContainerStarted","Data":"3285ec10c0066df96f2e89a02367d01cdf3c81fba042953fde3602d36311330c"} Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.940638 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" podUID="d423f7ba-2751-4d99-8102-3bc52b302161" Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.941986 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" event={"ID":"a3c22d9b-6ba0-4dd2-861d-8685c18e9330","Type":"ContainerStarted","Data":"f02de2dd0dafacea8b5f5229718cea7a98f2cc7503b70fda904a4167ac903dfe"} Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.944502 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" event={"ID":"7b9749c7-038f-4814-9357-623346c9172c","Type":"ContainerStarted","Data":"030565104d8040ddd5a1d3e05506bef295ad0ac9ff14c0cf1fd7cc6a3d83ae01"} Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.946554 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" event={"ID":"cf7c4631-b19a-4160-8581-15f72869a60b","Type":"ContainerStarted","Data":"34932151fc3b56e1b5a94958b6f702e6c96c37cce853efb4e3e302718aa28b8e"} Feb 17 14:24:16 crc kubenswrapper[4836]: E0217 14:24:16.950211 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" podUID="cf7c4631-b19a-4160-8581-15f72869a60b" Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.965650 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" event={"ID":"d0c3c41c-ac60-40f0-bdfb-8fe641c9426a","Type":"ContainerStarted","Data":"58778d7f45a38abceed1743a34f03d923fd31e400a10b56b2222e8be90be5561"} Feb 17 14:24:16 crc kubenswrapper[4836]: I0217 14:24:16.982228 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" 
event={"ID":"18a63480-edc2-44ed-bd43-b7750f7f8f33","Type":"ContainerStarted","Data":"35919d25fec64b955968200dd4f791a38595555b32f227a2f54c29ec986a4484"} Feb 17 14:24:17 crc kubenswrapper[4836]: I0217 14:24:17.264523 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:17 crc kubenswrapper[4836]: E0217 14:24:17.264862 4836 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:17 crc kubenswrapper[4836]: E0217 14:24:17.264989 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert podName:a1ae24b8-83c8-416d-9d39-24d84eb6cd83 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:21.264949612 +0000 UTC m=+1087.607878051 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert") pod "infra-operator-controller-manager-79d975b745-f4fvp" (UID: "a1ae24b8-83c8-416d-9d39-24d84eb6cd83") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:17 crc kubenswrapper[4836]: I0217 14:24:17.979947 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:17 crc kubenswrapper[4836]: E0217 14:24:17.980469 4836 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:17 crc kubenswrapper[4836]: E0217 14:24:17.980620 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert podName:4affaaf4-1113-4635-b30f-da26e04f6662 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:21.980596814 +0000 UTC m=+1088.323525093 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" (UID: "4affaaf4-1113-4635-b30f-da26e04f6662") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:18 crc kubenswrapper[4836]: E0217 14:24:18.002964 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" podUID="cf7c4631-b19a-4160-8581-15f72869a60b" Feb 17 14:24:18 crc kubenswrapper[4836]: E0217 14:24:18.027698 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" podUID="d423f7ba-2751-4d99-8102-3bc52b302161" Feb 17 14:24:18 crc kubenswrapper[4836]: I0217 14:24:18.626679 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:18 crc kubenswrapper[4836]: I0217 14:24:18.627086 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod 
\"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:18 crc kubenswrapper[4836]: E0217 14:24:18.627156 4836 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:24:18 crc kubenswrapper[4836]: E0217 14:24:18.627257 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:22.62723233 +0000 UTC m=+1088.970160599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "webhook-server-cert" not found Feb 17 14:24:18 crc kubenswrapper[4836]: E0217 14:24:18.627403 4836 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:24:18 crc kubenswrapper[4836]: E0217 14:24:18.627490 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:22.627464246 +0000 UTC m=+1088.970392725 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "metrics-server-cert" not found Feb 17 14:24:21 crc kubenswrapper[4836]: I0217 14:24:21.284244 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:21 crc kubenswrapper[4836]: E0217 14:24:21.284490 4836 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:21 crc kubenswrapper[4836]: E0217 14:24:21.284798 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert podName:a1ae24b8-83c8-416d-9d39-24d84eb6cd83 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:29.284773961 +0000 UTC m=+1095.627702240 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert") pod "infra-operator-controller-manager-79d975b745-f4fvp" (UID: "a1ae24b8-83c8-416d-9d39-24d84eb6cd83") : secret "infra-operator-webhook-server-cert" not found Feb 17 14:24:21 crc kubenswrapper[4836]: I0217 14:24:21.995066 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:21 crc kubenswrapper[4836]: E0217 14:24:21.995276 4836 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:21 crc kubenswrapper[4836]: E0217 14:24:21.995375 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert podName:4affaaf4-1113-4635-b30f-da26e04f6662 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:29.995356748 +0000 UTC m=+1096.338285017 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" (UID: "4affaaf4-1113-4635-b30f-da26e04f6662") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 14:24:22 crc kubenswrapper[4836]: I0217 14:24:22.704751 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:22 crc kubenswrapper[4836]: I0217 14:24:22.705397 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:22 crc kubenswrapper[4836]: E0217 14:24:22.705553 4836 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 14:24:22 crc kubenswrapper[4836]: E0217 14:24:22.705605 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:30.705590578 +0000 UTC m=+1097.048518847 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "webhook-server-cert" not found Feb 17 14:24:22 crc kubenswrapper[4836]: E0217 14:24:22.706060 4836 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 14:24:22 crc kubenswrapper[4836]: E0217 14:24:22.706084 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs podName:ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48 nodeName:}" failed. No retries permitted until 2026-02-17 14:24:30.70607644 +0000 UTC m=+1097.049004709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs") pod "openstack-operator-controller-manager-667f54696f-kskgn" (UID: "ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48") : secret "metrics-server-cert" not found Feb 17 14:24:29 crc kubenswrapper[4836]: E0217 14:24:29.247057 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 17 14:24:29 crc kubenswrapper[4836]: E0217 14:24:29.247949 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wn8jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-6lzts_openstack-operators(9ccd7ed5-2772-4482-af31-2578e98011fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:29 crc kubenswrapper[4836]: E0217 14:24:29.249190 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" podUID="9ccd7ed5-2772-4482-af31-2578e98011fd" Feb 17 14:24:29 crc kubenswrapper[4836]: I0217 14:24:29.321403 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:29 crc kubenswrapper[4836]: I0217 14:24:29.344160 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a1ae24b8-83c8-416d-9d39-24d84eb6cd83-cert\") pod \"infra-operator-controller-manager-79d975b745-f4fvp\" (UID: \"a1ae24b8-83c8-416d-9d39-24d84eb6cd83\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:29 crc kubenswrapper[4836]: I0217 14:24:29.506931 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-s5464" Feb 17 14:24:29 crc kubenswrapper[4836]: I0217 14:24:29.514363 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.033842 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.041190 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4affaaf4-1113-4635-b30f-da26e04f6662-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht\" (UID: \"4affaaf4-1113-4635-b30f-da26e04f6662\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.095186 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-sz5r7" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.105689 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:30 crc kubenswrapper[4836]: E0217 14:24:30.150836 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" podUID="9ccd7ed5-2772-4482-af31-2578e98011fd" Feb 17 14:24:30 crc kubenswrapper[4836]: E0217 14:24:30.217269 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34" Feb 17 14:24:30 crc kubenswrapper[4836]: E0217 14:24:30.217525 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rxx68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-69f8888797-llzlm_openstack-operators(1bb12b86-1f25-4dd9-a44d-449a6deee701): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:30 crc kubenswrapper[4836]: E0217 14:24:30.219594 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" podUID="1bb12b86-1f25-4dd9-a44d-449a6deee701" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.725883 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.726005 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.731176 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-metrics-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.731996 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48-webhook-certs\") pod \"openstack-operator-controller-manager-667f54696f-kskgn\" (UID: \"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48\") " pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:30 crc kubenswrapper[4836]: I0217 14:24:30.865802 4836 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:31 crc kubenswrapper[4836]: E0217 14:24:31.157152 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:229fc8c8d94dd4102d2151cd4ec1eaaa09d897c2b396d06e903f61ea29c1fa34\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" podUID="1bb12b86-1f25-4dd9-a44d-449a6deee701" Feb 17 14:24:34 crc kubenswrapper[4836]: E0217 14:24:34.042810 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6" Feb 17 14:24:34 crc kubenswrapper[4836]: E0217 14:24:34.043520 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j9gnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-ztvz2_openstack-operators(d4aa765a-0f56-4f05-b02f-f041841bc97d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:34 crc kubenswrapper[4836]: E0217 14:24:34.044783 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" podUID="d4aa765a-0f56-4f05-b02f-f041841bc97d" Feb 17 14:24:34 crc kubenswrapper[4836]: E0217 14:24:34.188508 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" podUID="d4aa765a-0f56-4f05-b02f-f041841bc97d" Feb 17 14:24:35 crc kubenswrapper[4836]: E0217 14:24:35.144648 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a" Feb 17 14:24:35 crc kubenswrapper[4836]: E0217 14:24:35.145373 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6x2cc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6994f66f48-zkzrs_openstack-operators(7b9749c7-038f-4814-9357-623346c9172c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:35 crc kubenswrapper[4836]: E0217 14:24:35.146628 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" podUID="7b9749c7-038f-4814-9357-623346c9172c" Feb 17 14:24:35 crc kubenswrapper[4836]: E0217 14:24:35.194814 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a18f12497b7159b100fcfd72c7ba2273d0669a5c00600a9ff1333bca028f256a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" podUID="7b9749c7-038f-4814-9357-623346c9172c" Feb 17 14:24:35 crc kubenswrapper[4836]: E0217 14:24:35.755406 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759" Feb 17 14:24:35 crc kubenswrapper[4836]: E0217 14:24:35.755628 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xqpqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-mq76b_openstack-operators(f6ba6343-872d-4e36-accf-959bb437f82d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:35 crc kubenswrapper[4836]: E0217 14:24:35.757589 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" podUID="f6ba6343-872d-4e36-accf-959bb437f82d" Feb 17 14:24:36 crc kubenswrapper[4836]: E0217 14:24:36.201902 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" podUID="f6ba6343-872d-4e36-accf-959bb437f82d" Feb 17 14:24:36 crc kubenswrapper[4836]: E0217 14:24:36.301652 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04" Feb 17 14:24:36 crc kubenswrapper[4836]: E0217 14:24:36.301917 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kv6mg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-7ktgs_openstack-operators(d0c3c41c-ac60-40f0-bdfb-8fe641c9426a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:36 crc kubenswrapper[4836]: E0217 14:24:36.303092 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" podUID="d0c3c41c-ac60-40f0-bdfb-8fe641c9426a" Feb 17 14:24:36 crc kubenswrapper[4836]: E0217 14:24:36.844665 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 17 14:24:36 crc kubenswrapper[4836]: E0217 14:24:36.844968 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v292x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-b6cfm_openstack-operators(12cff299-e5ea-40a9-8a69-528c478cd0a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:36 crc kubenswrapper[4836]: E0217 14:24:36.847101 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" podUID="12cff299-e5ea-40a9-8a69-528c478cd0a0" Feb 17 14:24:37 crc kubenswrapper[4836]: E0217 14:24:37.210931 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" podUID="12cff299-e5ea-40a9-8a69-528c478cd0a0" Feb 17 14:24:37 crc kubenswrapper[4836]: E0217 14:24:37.212503 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" podUID="d0c3c41c-ac60-40f0-bdfb-8fe641c9426a" Feb 17 14:24:37 crc kubenswrapper[4836]: E0217 14:24:37.392700 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 17 14:24:37 crc kubenswrapper[4836]: E0217 14:24:37.392928 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mwb95,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-6c4rn_openstack-operators(3d12b131-73a0-477e-ab9e-579309b0f5b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:37 crc kubenswrapper[4836]: E0217 14:24:37.395085 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" podUID="3d12b131-73a0-477e-ab9e-579309b0f5b1" Feb 17 14:24:38 crc kubenswrapper[4836]: E0217 14:24:38.074328 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da" Feb 17 14:24:38 crc kubenswrapper[4836]: E0217 14:24:38.074507 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xmrdc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5b9b8895d5-bv7s8_openstack-operators(f2e6ac9f-ee72-4a28-b298-9b2f918d0c95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:38 crc kubenswrapper[4836]: E0217 14:24:38.075970 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" podUID="f2e6ac9f-ee72-4a28-b298-9b2f918d0c95" Feb 17 14:24:38 crc kubenswrapper[4836]: E0217 14:24:38.228553 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" podUID="3d12b131-73a0-477e-ab9e-579309b0f5b1" Feb 17 14:24:38 crc kubenswrapper[4836]: E0217 14:24:38.228780 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9f2e1299d908411457e53b49e1062265d2b9d76f6719db24d1be9347c388e4da\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" podUID="f2e6ac9f-ee72-4a28-b298-9b2f918d0c95" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.092949 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.093187 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4nnnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-554564d7fc-k9p46_openstack-operators(e805966b-ea22-4c2a-a6c4-3622300fcb2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.094384 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" podUID="e805966b-ea22-4c2a-a6c4-3622300fcb2f" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.242410 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:7e1b0b7b172ad0d707ab80dd72d609e1d0f5bbd38a22c24a28ed0f17a960c867\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" podUID="e805966b-ea22-4c2a-a6c4-3622300fcb2f" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.432167 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.432277 4836 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.432523 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mzgtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d6964fcdb-rbq62_openstack-operators(a3c22d9b-6ba0-4dd2-861d-8685c18e9330): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.433797 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled 
desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" podUID="a3c22d9b-6ba0-4dd2-861d-8685c18e9330" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.978385 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.978674 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fhcrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-qnb5b_openstack-operators(18a63480-edc2-44ed-bd43-b7750f7f8f33): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:39 crc kubenswrapper[4836]: E0217 14:24:39.980006 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" podUID="18a63480-edc2-44ed-bd43-b7750f7f8f33" Feb 17 14:24:40 crc kubenswrapper[4836]: E0217 14:24:40.250325 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.5:5001/openstack-k8s-operators/telemetry-operator:49fb0a393e644ad55559f09981950c6ee3a56dc1\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" podUID="a3c22d9b-6ba0-4dd2-861d-8685c18e9330" Feb 17 14:24:40 crc kubenswrapper[4836]: E0217 14:24:40.250863 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" podUID="18a63480-edc2-44ed-bd43-b7750f7f8f33" Feb 17 14:24:40 crc kubenswrapper[4836]: E0217 14:24:40.624558 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 17 14:24:40 crc kubenswrapper[4836]: E0217 14:24:40.624782 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jxl4z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-5hz7c_openstack-operators(52a90e1a-0e2d-4488-8a1a-34de15bfa3a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:24:40 crc kubenswrapper[4836]: E0217 14:24:40.626502 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" podUID="52a90e1a-0e2d-4488-8a1a-34de15bfa3a5" Feb 17 14:24:41 crc kubenswrapper[4836]: E0217 14:24:41.259386 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" podUID="52a90e1a-0e2d-4488-8a1a-34de15bfa3a5" Feb 17 14:24:43 crc kubenswrapper[4836]: I0217 14:24:43.244987 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp"] Feb 17 14:24:43 crc kubenswrapper[4836]: W0217 14:24:43.534614 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1ae24b8_83c8_416d_9d39_24d84eb6cd83.slice/crio-29c49fea7918f4841548ce64de5334ffb851ee553c872044a1a7d3506146bcc1 WatchSource:0}: Error finding container 29c49fea7918f4841548ce64de5334ffb851ee553c872044a1a7d3506146bcc1: Status 404 returned error can't find the container with id 29c49fea7918f4841548ce64de5334ffb851ee553c872044a1a7d3506146bcc1 Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.076473 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht"] Feb 17 14:24:44 crc kubenswrapper[4836]: W0217 14:24:44.102799 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4affaaf4_1113_4635_b30f_da26e04f6662.slice/crio-207a583e5ef8f57f77f7026c8fc84b6f995ef483add4599c8abedcb41cbd7100 WatchSource:0}: Error finding container 207a583e5ef8f57f77f7026c8fc84b6f995ef483add4599c8abedcb41cbd7100: Status 404 returned error can't find the container 
with id 207a583e5ef8f57f77f7026c8fc84b6f995ef483add4599c8abedcb41cbd7100 Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.107156 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn"] Feb 17 14:24:44 crc kubenswrapper[4836]: W0217 14:24:44.123128 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca022a36_1c0e_4d3b_a6cf_87f4a78cfd48.slice/crio-b9a51bec82daee147a4b5b4b6929361e333f732f6fd29c9819a6f2fbbc2af054 WatchSource:0}: Error finding container b9a51bec82daee147a4b5b4b6929361e333f732f6fd29c9819a6f2fbbc2af054: Status 404 returned error can't find the container with id b9a51bec82daee147a4b5b4b6929361e333f732f6fd29c9819a6f2fbbc2af054 Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.309618 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" event={"ID":"cf7c4631-b19a-4160-8581-15f72869a60b","Type":"ContainerStarted","Data":"958fcda9023106cd43167a56dddecd3ccee7273b15bfa8412cad588e8c2edb03"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.310833 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.329943 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" event={"ID":"a7c6acc7-4243-4c0d-a723-e83dc2e054df","Type":"ContainerStarted","Data":"e56083496734d6d4073a7b0c7e0e5be4b8d8f4db663d854e080d73b1ec0786d7"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.330182 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 
14:24:44.350535 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" event={"ID":"0962ca43-43c4-4884-bd8e-889835f83632","Type":"ContainerStarted","Data":"e63affa4126c2053cd36b1f5738f4d983b8d1752706f2e3a5c6b3007ff77a087"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.351366 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.362482 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" event={"ID":"c3d9def3-7f53-4acc-9c46-d37ddf41e3b7","Type":"ContainerStarted","Data":"06f9ef1d7d3b07ef46637391a35b2eeaa0cb72f854cc7159a4aed561e3024636"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.363225 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.376477 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" podStartSLOduration=3.599888231 podStartE2EDuration="31.376395907s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.947564649 +0000 UTC m=+1082.290492918" lastFinishedPulling="2026-02-17 14:24:43.724072325 +0000 UTC m=+1110.067000594" observedRunningTime="2026-02-17 14:24:44.3650595 +0000 UTC m=+1110.707987779" watchObservedRunningTime="2026-02-17 14:24:44.376395907 +0000 UTC m=+1110.719324176" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.390673 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" 
event={"ID":"ce77a6a5-95bb-4758-8a38-cdc354fd9d6c","Type":"ContainerStarted","Data":"3f2229ea30a416d447515977dca499b1a4097965d6864eebeb909c756f1b55e5"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.391399 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.396539 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" event={"ID":"1f238b1a-4c0c-45de-bb7a-12946f426b89","Type":"ContainerStarted","Data":"2476fcf1b1ba9365ac703f9c160d00016da1128afa99f6bc0d6399a11c9a9f48"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.396751 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.408004 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" event={"ID":"d423f7ba-2751-4d99-8102-3bc52b302161","Type":"ContainerStarted","Data":"4a6e1d7cc23236717b9d91223594c2175b6cb0c66ca4b09b080158d4c7387cd6"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.409729 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" podStartSLOduration=5.700681604 podStartE2EDuration="31.40970954s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:14.894788921 +0000 UTC m=+1081.237717190" lastFinishedPulling="2026-02-17 14:24:40.603816857 +0000 UTC m=+1106.946745126" observedRunningTime="2026-02-17 14:24:44.397645083 +0000 UTC m=+1110.740573352" watchObservedRunningTime="2026-02-17 14:24:44.40970954 +0000 UTC m=+1110.752637809" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.422746 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" event={"ID":"9ccd7ed5-2772-4482-af31-2578e98011fd","Type":"ContainerStarted","Data":"2b435b6c7424118be4215baf54140a03ec61746af41ee081413eb2479f1896c7"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.423905 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.435096 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" event={"ID":"a1ae24b8-83c8-416d-9d39-24d84eb6cd83","Type":"ContainerStarted","Data":"29c49fea7918f4841548ce64de5334ffb851ee553c872044a1a7d3506146bcc1"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.452346 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" event={"ID":"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48","Type":"ContainerStarted","Data":"b9a51bec82daee147a4b5b4b6929361e333f732f6fd29c9819a6f2fbbc2af054"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.455482 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" podStartSLOduration=6.640750083 podStartE2EDuration="32.455459941s" podCreationTimestamp="2026-02-17 14:24:12 +0000 UTC" firstStartedPulling="2026-02-17 14:24:14.786370435 +0000 UTC m=+1081.129298704" lastFinishedPulling="2026-02-17 14:24:40.601080293 +0000 UTC m=+1106.944008562" observedRunningTime="2026-02-17 14:24:44.442940761 +0000 UTC m=+1110.785869040" watchObservedRunningTime="2026-02-17 14:24:44.455459941 +0000 UTC m=+1110.798388210" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.474154 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" event={"ID":"4affaaf4-1113-4635-b30f-da26e04f6662","Type":"ContainerStarted","Data":"207a583e5ef8f57f77f7026c8fc84b6f995ef483add4599c8abedcb41cbd7100"} Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.487051 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" podStartSLOduration=5.962083731 podStartE2EDuration="31.487012986s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.078136832 +0000 UTC m=+1081.421065101" lastFinishedPulling="2026-02-17 14:24:40.603066087 +0000 UTC m=+1106.945994356" observedRunningTime="2026-02-17 14:24:44.472838021 +0000 UTC m=+1110.815766310" watchObservedRunningTime="2026-02-17 14:24:44.487012986 +0000 UTC m=+1110.829941255" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.533933 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" podStartSLOduration=6.794177276 podStartE2EDuration="31.533909867s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.863993854 +0000 UTC m=+1082.206922123" lastFinishedPulling="2026-02-17 14:24:40.603726445 +0000 UTC m=+1106.946654714" observedRunningTime="2026-02-17 14:24:44.529759965 +0000 UTC m=+1110.872688244" watchObservedRunningTime="2026-02-17 14:24:44.533909867 +0000 UTC m=+1110.876838136" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.607765 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" podStartSLOduration=2.973091447 podStartE2EDuration="31.607738239s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.089423223 +0000 UTC m=+1081.432351492" 
lastFinishedPulling="2026-02-17 14:24:43.724070015 +0000 UTC m=+1110.066998284" observedRunningTime="2026-02-17 14:24:44.605731704 +0000 UTC m=+1110.948659973" watchObservedRunningTime="2026-02-17 14:24:44.607738239 +0000 UTC m=+1110.950666508" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.611692 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" podStartSLOduration=6.099904681 podStartE2EDuration="31.611680225s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.090741788 +0000 UTC m=+1081.433670047" lastFinishedPulling="2026-02-17 14:24:40.602517322 +0000 UTC m=+1106.945445591" observedRunningTime="2026-02-17 14:24:44.556854759 +0000 UTC m=+1110.899783028" watchObservedRunningTime="2026-02-17 14:24:44.611680225 +0000 UTC m=+1110.954608514" Feb 17 14:24:44 crc kubenswrapper[4836]: I0217 14:24:44.638475 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-w4dds" podStartSLOduration=3.128180479 podStartE2EDuration="30.638458511s" podCreationTimestamp="2026-02-17 14:24:14 +0000 UTC" firstStartedPulling="2026-02-17 14:24:16.22139024 +0000 UTC m=+1082.564318509" lastFinishedPulling="2026-02-17 14:24:43.731668272 +0000 UTC m=+1110.074596541" observedRunningTime="2026-02-17 14:24:44.632779157 +0000 UTC m=+1110.975707446" watchObservedRunningTime="2026-02-17 14:24:44.638458511 +0000 UTC m=+1110.981386780" Feb 17 14:24:45 crc kubenswrapper[4836]: I0217 14:24:45.490983 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" event={"ID":"ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48","Type":"ContainerStarted","Data":"708a2fe9e35db4f83b40aa4c7322845835b651a153295351abeb42dbbcd2edd8"} Feb 17 14:24:45 crc kubenswrapper[4836]: I0217 14:24:45.491456 4836 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:45 crc kubenswrapper[4836]: I0217 14:24:45.499247 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" event={"ID":"1bb12b86-1f25-4dd9-a44d-449a6deee701","Type":"ContainerStarted","Data":"cf32f810bc332f8efcb400d9b1dcbe8ca81ac1a06416a46b64f17eae54b3e1db"} Feb 17 14:24:45 crc kubenswrapper[4836]: I0217 14:24:45.615861 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" podStartSLOduration=31.615831814 podStartE2EDuration="31.615831814s" podCreationTimestamp="2026-02-17 14:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:24:45.570264089 +0000 UTC m=+1111.913192358" watchObservedRunningTime="2026-02-17 14:24:45.615831814 +0000 UTC m=+1111.958760083" Feb 17 14:24:45 crc kubenswrapper[4836]: I0217 14:24:45.673824 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" podStartSLOduration=4.218669397 podStartE2EDuration="32.673792285s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.775226671 +0000 UTC m=+1082.118154940" lastFinishedPulling="2026-02-17 14:24:44.230349559 +0000 UTC m=+1110.573277828" observedRunningTime="2026-02-17 14:24:45.618517306 +0000 UTC m=+1111.961445605" watchObservedRunningTime="2026-02-17 14:24:45.673792285 +0000 UTC m=+1112.016720574" Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.615680 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" 
event={"ID":"d4aa765a-0f56-4f05-b02f-f041841bc97d","Type":"ContainerStarted","Data":"5191cedafd66ba2cd22d023c2056361c8e2acf7c2b599367e99e74082824f087"} Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.616811 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.856721 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" event={"ID":"f6ba6343-872d-4e36-accf-959bb437f82d","Type":"ContainerStarted","Data":"e24c63fa0f774e9fa7bb7ef037058452c11973342b3b28780f09ed7f801ebff1"} Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.857274 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.868336 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" event={"ID":"7b9749c7-038f-4814-9357-623346c9172c","Type":"ContainerStarted","Data":"be7ade67079dc5021195621f82d2860a5ef1f09ba1cff570b6f8a7a6c21d0ab9"} Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.868932 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.870964 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" event={"ID":"a1ae24b8-83c8-416d-9d39-24d84eb6cd83","Type":"ContainerStarted","Data":"3282111097c5b5c1e0bb4f2354474b0578f6c05ab2c972060fe1a6428e670589"} Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.871168 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.880141 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" event={"ID":"d0c3c41c-ac60-40f0-bdfb-8fe641c9426a","Type":"ContainerStarted","Data":"9768fb7add75b5b01fc42c92fdcccdabd19bed5b5c0fb476beebce8f4b53ded5"} Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.882718 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.885091 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" event={"ID":"4affaaf4-1113-4635-b30f-da26e04f6662","Type":"ContainerStarted","Data":"c6739dc27b24b8b995bb3d2605c8cb7044479d89652a36941d38b08687be632f"} Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.885367 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:24:50 crc kubenswrapper[4836]: I0217 14:24:50.921239 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-667f54696f-kskgn" Feb 17 14:24:51 crc kubenswrapper[4836]: I0217 14:24:51.661144 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" podStartSLOduration=5.192983177 podStartE2EDuration="38.661121159s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:16.208330931 +0000 UTC m=+1082.551259200" lastFinishedPulling="2026-02-17 14:24:49.676468913 +0000 UTC m=+1116.019397182" observedRunningTime="2026-02-17 14:24:50.999408443 +0000 UTC 
m=+1117.342336722" watchObservedRunningTime="2026-02-17 14:24:51.661121159 +0000 UTC m=+1118.004049438" Feb 17 14:24:51 crc kubenswrapper[4836]: I0217 14:24:51.713572 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" podStartSLOduration=4.765050083 podStartE2EDuration="38.71353428s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.724698216 +0000 UTC m=+1082.067626485" lastFinishedPulling="2026-02-17 14:24:49.673182413 +0000 UTC m=+1116.016110682" observedRunningTime="2026-02-17 14:24:51.677607956 +0000 UTC m=+1118.020536225" watchObservedRunningTime="2026-02-17 14:24:51.71353428 +0000 UTC m=+1118.056462549" Feb 17 14:24:51 crc kubenswrapper[4836]: I0217 14:24:51.715643 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" podStartSLOduration=5.176168587 podStartE2EDuration="38.715630267s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.825379406 +0000 UTC m=+1082.168307675" lastFinishedPulling="2026-02-17 14:24:49.364841086 +0000 UTC m=+1115.707769355" observedRunningTime="2026-02-17 14:24:51.708670168 +0000 UTC m=+1118.051598447" watchObservedRunningTime="2026-02-17 14:24:51.715630267 +0000 UTC m=+1118.058558536" Feb 17 14:24:51 crc kubenswrapper[4836]: I0217 14:24:51.746042 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" podStartSLOduration=4.75159594 podStartE2EDuration="38.74601001s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:16.030465806 +0000 UTC m=+1082.373394075" lastFinishedPulling="2026-02-17 14:24:50.024879876 +0000 UTC m=+1116.367808145" observedRunningTime="2026-02-17 14:24:51.741182849 +0000 UTC m=+1118.084111118" 
watchObservedRunningTime="2026-02-17 14:24:51.74601001 +0000 UTC m=+1118.088938279" Feb 17 14:24:51 crc kubenswrapper[4836]: I0217 14:24:51.786669 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" podStartSLOduration=33.539618684 podStartE2EDuration="38.786633792s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:44.115896066 +0000 UTC m=+1110.458824335" lastFinishedPulling="2026-02-17 14:24:49.362911174 +0000 UTC m=+1115.705839443" observedRunningTime="2026-02-17 14:24:51.780022242 +0000 UTC m=+1118.122950531" watchObservedRunningTime="2026-02-17 14:24:51.786633792 +0000 UTC m=+1118.129562071" Feb 17 14:24:51 crc kubenswrapper[4836]: I0217 14:24:51.944880 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" podStartSLOduration=32.835566069 podStartE2EDuration="38.94485598s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:43.539278736 +0000 UTC m=+1109.882207005" lastFinishedPulling="2026-02-17 14:24:49.648568657 +0000 UTC m=+1115.991496916" observedRunningTime="2026-02-17 14:24:51.941472738 +0000 UTC m=+1118.284401037" watchObservedRunningTime="2026-02-17 14:24:51.94485598 +0000 UTC m=+1118.287784249" Feb 17 14:24:53 crc kubenswrapper[4836]: I0217 14:24:53.395360 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-54696" Feb 17 14:24:53 crc kubenswrapper[4836]: I0217 14:24:53.801174 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-zxb25" Feb 17 14:24:53 crc kubenswrapper[4836]: I0217 14:24:53.801654 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-8wdwr" Feb 17 14:24:53 crc kubenswrapper[4836]: I0217 14:24:53.815151 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-7vwdd" Feb 17 14:24:53 crc kubenswrapper[4836]: I0217 14:24:53.996257 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-6lzts" Feb 17 14:24:54 crc kubenswrapper[4836]: I0217 14:24:54.331813 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" Feb 17 14:24:54 crc kubenswrapper[4836]: I0217 14:24:54.396189 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-llzlm" Feb 17 14:24:54 crc kubenswrapper[4836]: I0217 14:24:54.413879 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-zkzrs" Feb 17 14:24:54 crc kubenswrapper[4836]: I0217 14:24:54.542915 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-jnxzt" Feb 17 14:24:54 crc kubenswrapper[4836]: I0217 14:24:54.829211 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-ztvz2" Feb 17 14:24:54 crc kubenswrapper[4836]: I0217 14:24:54.831222 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lmtng" Feb 17 14:24:55 crc kubenswrapper[4836]: I0217 14:24:55.278414 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-mq76b" Feb 17 14:24:59 crc kubenswrapper[4836]: I0217 14:24:59.760918 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-f4fvp" Feb 17 14:25:00 crc kubenswrapper[4836]: I0217 14:25:00.116204 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.371179 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" event={"ID":"e805966b-ea22-4c2a-a6c4-3622300fcb2f","Type":"ContainerStarted","Data":"2321f59df1f41fd133675fc8ee34eba7282d0495b1c7daa28ec9b46cd02156b1"} Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.378320 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.381058 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" event={"ID":"3d12b131-73a0-477e-ab9e-579309b0f5b1","Type":"ContainerStarted","Data":"807688d0e1d6c570f4713555266e9275668dfb9982995d91bdd84c6cbaf4e0d0"} Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.382911 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.389473 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" event={"ID":"12cff299-e5ea-40a9-8a69-528c478cd0a0","Type":"ContainerStarted","Data":"f8d70c0e3d96a6dfb60dfb8639e0e5011472512bc988c98b13b487c2828e8971"} Feb 17 14:25:03 crc 
kubenswrapper[4836]: I0217 14:25:03.395623 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.421169 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" event={"ID":"18a63480-edc2-44ed-bd43-b7750f7f8f33","Type":"ContainerStarted","Data":"7f2b3c2f7eec73e27bcb4cd5915c8293e58fe18df35b97da46fbb0d6fd1af60e"} Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.423870 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.453968 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" event={"ID":"f2e6ac9f-ee72-4a28-b298-9b2f918d0c95","Type":"ContainerStarted","Data":"8ce1dabe9924cb8b28bf979f68fdd93d0f836e1196ca4a2fd80cc8339a67b9a6"} Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.455536 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" podStartSLOduration=3.987661765 podStartE2EDuration="50.455489s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.717349071 +0000 UTC m=+1082.060277340" lastFinishedPulling="2026-02-17 14:25:02.185176306 +0000 UTC m=+1128.528104575" observedRunningTime="2026-02-17 14:25:03.433309848 +0000 UTC m=+1129.776238127" watchObservedRunningTime="2026-02-17 14:25:03.455489 +0000 UTC m=+1129.798417289" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.458959 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" Feb 17 14:25:03 crc 
kubenswrapper[4836]: I0217 14:25:03.462990 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" event={"ID":"a3c22d9b-6ba0-4dd2-861d-8685c18e9330","Type":"ContainerStarted","Data":"cac0671ac8dbcb53aec8149b0b645141f6585af883bb10cdeb19994be97350ba"} Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.465897 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.507239 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" podStartSLOduration=3.661467534 podStartE2EDuration="50.507184731s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.339814839 +0000 UTC m=+1081.682743098" lastFinishedPulling="2026-02-17 14:25:02.185532026 +0000 UTC m=+1128.528460295" observedRunningTime="2026-02-17 14:25:03.491467084 +0000 UTC m=+1129.834395373" watchObservedRunningTime="2026-02-17 14:25:03.507184731 +0000 UTC m=+1129.850113010" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.572098 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" podStartSLOduration=4.24988045 podStartE2EDuration="50.572028948s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.863998604 +0000 UTC m=+1082.206926873" lastFinishedPulling="2026-02-17 14:25:02.186147102 +0000 UTC m=+1128.529075371" observedRunningTime="2026-02-17 14:25:03.548971214 +0000 UTC m=+1129.891899503" watchObservedRunningTime="2026-02-17 14:25:03.572028948 +0000 UTC m=+1129.914957217" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.637369 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" podStartSLOduration=4.023844526 podStartE2EDuration="51.637328519s" podCreationTimestamp="2026-02-17 14:24:12 +0000 UTC" firstStartedPulling="2026-02-17 14:24:14.58485843 +0000 UTC m=+1080.927786699" lastFinishedPulling="2026-02-17 14:25:02.198342423 +0000 UTC m=+1128.541270692" observedRunningTime="2026-02-17 14:25:03.631463729 +0000 UTC m=+1129.974391998" watchObservedRunningTime="2026-02-17 14:25:03.637328519 +0000 UTC m=+1129.980256818" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.683003 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" podStartSLOduration=4.349291914 podStartE2EDuration="50.682978485s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.864322703 +0000 UTC m=+1082.207250982" lastFinishedPulling="2026-02-17 14:25:02.198009274 +0000 UTC m=+1128.540937553" observedRunningTime="2026-02-17 14:25:03.675957705 +0000 UTC m=+1130.018885984" watchObservedRunningTime="2026-02-17 14:25:03.682978485 +0000 UTC m=+1130.025906754" Feb 17 14:25:03 crc kubenswrapper[4836]: I0217 14:25:03.718536 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" podStartSLOduration=3.534614384 podStartE2EDuration="50.718491079s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.013775689 +0000 UTC m=+1081.356703958" lastFinishedPulling="2026-02-17 14:25:02.197652384 +0000 UTC m=+1128.540580653" observedRunningTime="2026-02-17 14:25:03.710194014 +0000 UTC m=+1130.053122283" watchObservedRunningTime="2026-02-17 14:25:03.718491079 +0000 UTC m=+1130.061419348" Feb 17 14:25:04 crc kubenswrapper[4836]: I0217 14:25:04.477084 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" event={"ID":"52a90e1a-0e2d-4488-8a1a-34de15bfa3a5","Type":"ContainerStarted","Data":"f829eaa4f03e253de4cc0fe49720d25547d1eaa733834ad46123eace2cc39e92"} Feb 17 14:25:04 crc kubenswrapper[4836]: I0217 14:25:04.548579 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" podStartSLOduration=5.097211322 podStartE2EDuration="51.548545208s" podCreationTimestamp="2026-02-17 14:24:13 +0000 UTC" firstStartedPulling="2026-02-17 14:24:15.814518677 +0000 UTC m=+1082.157446946" lastFinishedPulling="2026-02-17 14:25:02.265852563 +0000 UTC m=+1128.608780832" observedRunningTime="2026-02-17 14:25:04.544572411 +0000 UTC m=+1130.887500690" watchObservedRunningTime="2026-02-17 14:25:04.548545208 +0000 UTC m=+1130.891473487" Feb 17 14:25:04 crc kubenswrapper[4836]: I0217 14:25:04.625846 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-7ktgs" Feb 17 14:25:13 crc kubenswrapper[4836]: I0217 14:25:13.367070 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-b6cfm" Feb 17 14:25:13 crc kubenswrapper[4836]: I0217 14:25:13.631810 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-bv7s8" Feb 17 14:25:14 crc kubenswrapper[4836]: I0217 14:25:14.192796 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-6c4rn" Feb 17 14:25:14 crc kubenswrapper[4836]: I0217 14:25:14.235241 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-k9p46" Feb 17 14:25:14 crc 
kubenswrapper[4836]: I0217 14:25:14.259192 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" Feb 17 14:25:14 crc kubenswrapper[4836]: I0217 14:25:14.263277 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-5hz7c" Feb 17 14:25:14 crc kubenswrapper[4836]: I0217 14:25:14.268228 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-qnb5b" Feb 17 14:25:14 crc kubenswrapper[4836]: I0217 14:25:14.633174 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d6964fcdb-rbq62" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.790905 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-46wms"] Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.794057 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.799951 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.800286 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vkr8h" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.800557 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.809274 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.823265 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-46wms"] Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.843831 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qmnr\" (UniqueName: \"kubernetes.io/projected/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-kube-api-access-8qmnr\") pod \"dnsmasq-dns-675f4bcbfc-46wms\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.844452 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-config\") pod \"dnsmasq-dns-675f4bcbfc-46wms\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.934198 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ggz9w"] Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.936775 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.946547 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvxj4\" (UniqueName: \"kubernetes.io/projected/24a665ea-1793-426d-b4df-48bfdd048f1c-kube-api-access-pvxj4\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.946661 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qmnr\" (UniqueName: \"kubernetes.io/projected/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-kube-api-access-8qmnr\") pod \"dnsmasq-dns-675f4bcbfc-46wms\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.946737 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.946829 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-config\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.946874 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-config\") pod \"dnsmasq-dns-675f4bcbfc-46wms\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") 
" pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.949377 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-config\") pod \"dnsmasq-dns-675f4bcbfc-46wms\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.963018 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 17 14:25:33 crc kubenswrapper[4836]: I0217 14:25:33.963238 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ggz9w"] Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.002174 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qmnr\" (UniqueName: \"kubernetes.io/projected/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-kube-api-access-8qmnr\") pod \"dnsmasq-dns-675f4bcbfc-46wms\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.056897 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvxj4\" (UniqueName: \"kubernetes.io/projected/24a665ea-1793-426d-b4df-48bfdd048f1c-kube-api-access-pvxj4\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.163642 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.165256 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.167952 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.168168 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-config\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.172048 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-config\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.205635 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvxj4\" (UniqueName: \"kubernetes.io/projected/24a665ea-1793-426d-b4df-48bfdd048f1c-kube-api-access-pvxj4\") pod \"dnsmasq-dns-78dd6ddcc-ggz9w\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.262737 
4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.848035 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-46wms"] Feb 17 14:25:34 crc kubenswrapper[4836]: I0217 14:25:34.944808 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ggz9w"] Feb 17 14:25:34 crc kubenswrapper[4836]: W0217 14:25:34.953174 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a665ea_1793_426d_b4df_48bfdd048f1c.slice/crio-a6bb86072d0798a1afb87a4b547d5cc9e6e5bb7f7f723d97aa4e7592e494e55c WatchSource:0}: Error finding container a6bb86072d0798a1afb87a4b547d5cc9e6e5bb7f7f723d97aa4e7592e494e55c: Status 404 returned error can't find the container with id a6bb86072d0798a1afb87a4b547d5cc9e6e5bb7f7f723d97aa4e7592e494e55c Feb 17 14:25:35 crc kubenswrapper[4836]: I0217 14:25:35.187682 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" event={"ID":"24a665ea-1793-426d-b4df-48bfdd048f1c","Type":"ContainerStarted","Data":"a6bb86072d0798a1afb87a4b547d5cc9e6e5bb7f7f723d97aa4e7592e494e55c"} Feb 17 14:25:35 crc kubenswrapper[4836]: I0217 14:25:35.189881 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" event={"ID":"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b","Type":"ContainerStarted","Data":"b0dca2b7fd572359a51505f1dec3dc3d3db3e7f58bf21d6f39749cf427d85d3b"} Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.355145 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-46wms"] Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.397246 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbcz5"] Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 
14:25:36.403973 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.439080 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbcz5"]
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.518438 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-config\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.518604 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.518651 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zckh\" (UniqueName: \"kubernetes.io/projected/e14b6d2f-85ef-4f0c-8a81-426aee02b456-kube-api-access-4zckh\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.623204 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.623322 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zckh\" (UniqueName: \"kubernetes.io/projected/e14b6d2f-85ef-4f0c-8a81-426aee02b456-kube-api-access-4zckh\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.623426 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-config\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.692998 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-config\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.697109 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-dns-svc\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.737207 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zckh\" (UniqueName: \"kubernetes.io/projected/e14b6d2f-85ef-4f0c-8a81-426aee02b456-kube-api-access-4zckh\") pod \"dnsmasq-dns-666b6646f7-jbcz5\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") " pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.800973 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.906734 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ggz9w"]
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.961442 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s6vqb"]
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.963678 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:36 crc kubenswrapper[4836]: I0217 14:25:36.980556 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s6vqb"]
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.030794 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-config\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.030929 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ntrv\" (UniqueName: \"kubernetes.io/projected/63d320ce-8669-4285-b4bc-dbb6eeb9a190-kube-api-access-8ntrv\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.031032 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.133281 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ntrv\" (UniqueName: \"kubernetes.io/projected/63d320ce-8669-4285-b4bc-dbb6eeb9a190-kube-api-access-8ntrv\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.134956 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.136531 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.136683 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-config\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.137593 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-config\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.161829 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ntrv\" (UniqueName: \"kubernetes.io/projected/63d320ce-8669-4285-b4bc-dbb6eeb9a190-kube-api-access-8ntrv\") pod \"dnsmasq-dns-57d769cc4f-s6vqb\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") " pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.300056 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbcz5"]
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.305013 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.570621 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.586164 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.586349 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.592281 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.592393 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.592518 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.592702 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.592796 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.592911 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-xcfhz"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.592992 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.859917 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.861625 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.861670 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.861849 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec9408e6-0474-4f84-842e-b1c20f42a7b8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.862011 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.862036 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.862066 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec9408e6-0474-4f84-842e-b1c20f42a7b8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.862142 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfz2p\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-kube-api-access-xfz2p\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.862174 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.862294 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.862394 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964398 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964471 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964555 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964586 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964607 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964645 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec9408e6-0474-4f84-842e-b1c20f42a7b8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964688 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964714 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964745 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec9408e6-0474-4f84-842e-b1c20f42a7b8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964773 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfz2p\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-kube-api-access-xfz2p\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.964797 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.965441 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.966265 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.971027 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec9408e6-0474-4f84-842e-b1c20f42a7b8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.971377 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.971537 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.971637 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/03c0dd7af3740fd4ae1135362211cc7ed6efb2bcdea721aed8377f0d38bda50d/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.971653 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec9408e6-0474-4f84-842e-b1c20f42a7b8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.976892 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.977779 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-config-data\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.978855 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec9408e6-0474-4f84-842e-b1c20f42a7b8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.989823 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:37 crc kubenswrapper[4836]: I0217 14:25:37.996065 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfz2p\" (UniqueName: \"kubernetes.io/projected/ec9408e6-0474-4f84-842e-b1c20f42a7b8-kube-api-access-xfz2p\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.032808 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d8603b9-3f26-4e03-8164-a0930fb9429c\") pod \"rabbitmq-server-0\" (UID: \"ec9408e6-0474-4f84-842e-b1c20f42a7b8\") " pod="openstack/rabbitmq-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.071215 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s6vqb"]
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.072063 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.085645 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.087898 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.092757 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-q7f7v"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.093126 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.093269 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.093855 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.095282 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.095747 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.095971 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 17 14:25:38 crc kubenswrapper[4836]: W0217 14:25:38.118377 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d320ce_8669_4285_b4bc_dbb6eeb9a190.slice/crio-0ccc818ba3aecccefe49bbab270ac8d64079fabeee863e8305c62599ebffa6de WatchSource:0}: Error finding container 0ccc818ba3aecccefe49bbab270ac8d64079fabeee863e8305c62599ebffa6de: Status 404 returned error can't find the container with id 0ccc818ba3aecccefe49bbab270ac8d64079fabeee863e8305c62599ebffa6de
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.139111 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.260080 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" event={"ID":"e14b6d2f-85ef-4f0c-8a81-426aee02b456","Type":"ContainerStarted","Data":"4be2faa5279826c8447da22307f09f3ad1d1675b115d7c7c5cab72070952c1fe"}
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270060 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270117 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270191 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270214 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270253 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270326 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f866bb7-5209-4275-8884-df6f074b3f7c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270353 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f866bb7-5209-4275-8884-df6f074b3f7c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270377 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270400 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270447 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.270477 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9h24\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-kube-api-access-t9h24\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.272610 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" event={"ID":"63d320ce-8669-4285-b4bc-dbb6eeb9a190","Type":"ContainerStarted","Data":"0ccc818ba3aecccefe49bbab270ac8d64079fabeee863e8305c62599ebffa6de"}
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.371900 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372383 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9h24\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-kube-api-access-t9h24\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372420 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372435 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372484 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372501 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372536 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372569 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f866bb7-5209-4275-8884-df6f074b3f7c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372587 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f866bb7-5209-4275-8884-df6f074b3f7c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372614 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.372642 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.373993 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.377773 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.377814 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.378256 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.379321 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f866bb7-5209-4275-8884-df6f074b3f7c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.383927 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f866bb7-5209-4275-8884-df6f074b3f7c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.390055 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f866bb7-5209-4275-8884-df6f074b3f7c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.390271 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.390900 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.390972 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b8a4c70e8190ff4e8e3819dded9c01c5615ea2f38f06b8e31f9d4a795c0f880b/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.402744 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9h24\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-kube-api-access-t9h24\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.403608 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f866bb7-5209-4275-8884-df6f074b3f7c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.435183 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-07fb2529-04ae-48ee-a7cb-9474d02cf39f\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f866bb7-5209-4275-8884-df6f074b3f7c\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.447087 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 17 14:25:38 crc kubenswrapper[4836]: I0217 14:25:38.596851 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 17 14:25:38 crc kubenswrapper[4836]: W0217 14:25:38.619356 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9408e6_0474_4f84_842e_b1c20f42a7b8.slice/crio-b0fa4d163a114845da5261a3895974dd34d3a05172b60f7f2dca90e0c423de30 WatchSource:0}: Error finding container b0fa4d163a114845da5261a3895974dd34d3a05172b60f7f2dca90e0c423de30: Status 404 returned error can't find the container with id b0fa4d163a114845da5261a3895974dd34d3a05172b60f7f2dca90e0c423de30
Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.166275 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 17 14:25:39 crc kubenswrapper[4836]: W0217 14:25:39.214794 4836 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f866bb7_5209_4275_8884_df6f074b3f7c.slice/crio-aa9448becf8224adce1ecee542747964e3dcfc59ddd21273b79be9dd9f859c35 WatchSource:0}: Error finding container aa9448becf8224adce1ecee542747964e3dcfc59ddd21273b79be9dd9f859c35: Status 404 returned error can't find the container with id aa9448becf8224adce1ecee542747964e3dcfc59ddd21273b79be9dd9f859c35 Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.280133 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.281990 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.285900 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xwjkv" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.286371 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.286495 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.286676 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.293087 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.299943 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.303368 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"ec9408e6-0474-4f84-842e-b1c20f42a7b8","Type":"ContainerStarted","Data":"b0fa4d163a114845da5261a3895974dd34d3a05172b60f7f2dca90e0c423de30"} Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.314069 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f866bb7-5209-4275-8884-df6f074b3f7c","Type":"ContainerStarted","Data":"aa9448becf8224adce1ecee542747964e3dcfc59ddd21273b79be9dd9f859c35"} Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.397338 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wccv\" (UniqueName: \"kubernetes.io/projected/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-kube-api-access-8wccv\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.397426 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.397462 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-kolla-config\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.397498 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-config-data-generated\") pod \"openstack-galera-0\" 
(UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.397553 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.398521 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.398643 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.398942 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-config-data-default\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.501242 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " 
pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.501331 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.501365 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.501420 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-config-data-default\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.501470 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wccv\" (UniqueName: \"kubernetes.io/projected/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-kube-api-access-8wccv\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.501510 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 
14:25:39.501536 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-kolla-config\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.501571 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.504850 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.505494 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-kolla-config\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.506571 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.507365 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-config-data-default\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.515943 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.516002 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/afb4cc9a14bf9ee01a267a35faf227427838c3a04bd3afa8c77910fa5827f2c9/globalmount\"" pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.529668 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.532559 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wccv\" (UniqueName: \"kubernetes.io/projected/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-kube-api-access-8wccv\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.532567 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd891e0-6f97-4fa3-8281-aa97232d6c6d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.573551 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b71cc85-2788-4798-ad47-2c45d9c63e69\") pod \"openstack-galera-0\" (UID: \"2fd891e0-6f97-4fa3-8281-aa97232d6c6d\") " pod="openstack/openstack-galera-0" Feb 17 14:25:39 crc kubenswrapper[4836]: I0217 14:25:39.620023 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.231318 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 14:25:40 crc kubenswrapper[4836]: W0217 14:25:40.336878 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd891e0_6f97_4fa3_8281_aa97232d6c6d.slice/crio-b7ccab72d6bfe7a32d290bc2bd21c7153f92a355a14103904bd51d275095d6c6 WatchSource:0}: Error finding container b7ccab72d6bfe7a32d290bc2bd21c7153f92a355a14103904bd51d275095d6c6: Status 404 returned error can't find the container with id b7ccab72d6bfe7a32d290bc2bd21c7153f92a355a14103904bd51d275095d6c6 Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.753652 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.765209 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.774207 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-br528" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.774622 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.775159 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.796868 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.808490 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.827675 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.827735 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d533d4c9-53ad-455f-9db7-827245c43d24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d533d4c9-53ad-455f-9db7-827245c43d24\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.827758 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.827787 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6016745-1634-4eb6-afee-b98ce9ab8f56-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.827841 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6016745-1634-4eb6-afee-b98ce9ab8f56-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.830183 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6785l\" (UniqueName: \"kubernetes.io/projected/a6016745-1634-4eb6-afee-b98ce9ab8f56-kube-api-access-6785l\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.830240 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6016745-1634-4eb6-afee-b98ce9ab8f56-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.830596 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.910752 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.913713 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.922544 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-zckhv" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.922934 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.923116 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934223 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6016745-1634-4eb6-afee-b98ce9ab8f56-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934329 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6785l\" (UniqueName: \"kubernetes.io/projected/a6016745-1634-4eb6-afee-b98ce9ab8f56-kube-api-access-6785l\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934367 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6016745-1634-4eb6-afee-b98ce9ab8f56-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934409 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934463 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934493 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d533d4c9-53ad-455f-9db7-827245c43d24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d533d4c9-53ad-455f-9db7-827245c43d24\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934513 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.934539 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a6016745-1634-4eb6-afee-b98ce9ab8f56-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.935057 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a6016745-1634-4eb6-afee-b98ce9ab8f56-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.940213 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.942581 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.942637 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a6016745-1634-4eb6-afee-b98ce9ab8f56-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.950642 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.950697 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d533d4c9-53ad-455f-9db7-827245c43d24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d533d4c9-53ad-455f-9db7-827245c43d24\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/de93ef36a3acf049f0dff48064a98354008e521ee562dddd1e6894d45770836f/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.956099 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a6016745-1634-4eb6-afee-b98ce9ab8f56-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.959717 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6785l\" (UniqueName: \"kubernetes.io/projected/a6016745-1634-4eb6-afee-b98ce9ab8f56-kube-api-access-6785l\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.978955 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 14:25:40 crc kubenswrapper[4836]: I0217 14:25:40.979703 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6016745-1634-4eb6-afee-b98ce9ab8f56-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.012933 4836 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-d533d4c9-53ad-455f-9db7-827245c43d24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d533d4c9-53ad-455f-9db7-827245c43d24\") pod \"openstack-cell1-galera-0\" (UID: \"a6016745-1634-4eb6-afee-b98ce9ab8f56\") " pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.083680 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce3babe4-6d77-45ce-b9cc-626678d3ec64-kolla-config\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.083775 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3babe4-6d77-45ce-b9cc-626678d3ec64-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.083819 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce3babe4-6d77-45ce-b9cc-626678d3ec64-config-data\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.083863 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcpcn\" (UniqueName: \"kubernetes.io/projected/ce3babe4-6d77-45ce-b9cc-626678d3ec64-kube-api-access-wcpcn\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.083883 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ce3babe4-6d77-45ce-b9cc-626678d3ec64-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.136952 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.185323 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcpcn\" (UniqueName: \"kubernetes.io/projected/ce3babe4-6d77-45ce-b9cc-626678d3ec64-kube-api-access-wcpcn\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.185377 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3babe4-6d77-45ce-b9cc-626678d3ec64-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.185432 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce3babe4-6d77-45ce-b9cc-626678d3ec64-kolla-config\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.185490 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3babe4-6d77-45ce-b9cc-626678d3ec64-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.185531 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ce3babe4-6d77-45ce-b9cc-626678d3ec64-config-data\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.186269 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ce3babe4-6d77-45ce-b9cc-626678d3ec64-config-data\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.188660 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ce3babe4-6d77-45ce-b9cc-626678d3ec64-kolla-config\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.195984 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce3babe4-6d77-45ce-b9cc-626678d3ec64-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.209415 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce3babe4-6d77-45ce-b9cc-626678d3ec64-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.288352 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcpcn\" (UniqueName: \"kubernetes.io/projected/ce3babe4-6d77-45ce-b9cc-626678d3ec64-kube-api-access-wcpcn\") pod \"memcached-0\" (UID: \"ce3babe4-6d77-45ce-b9cc-626678d3ec64\") " pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.310521 4836 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 14:25:41 crc kubenswrapper[4836]: I0217 14:25:41.460492 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fd891e0-6f97-4fa3-8281-aa97232d6c6d","Type":"ContainerStarted","Data":"b7ccab72d6bfe7a32d290bc2bd21c7153f92a355a14103904bd51d275095d6c6"} Feb 17 14:25:42 crc kubenswrapper[4836]: I0217 14:25:42.019420 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 14:25:42 crc kubenswrapper[4836]: I0217 14:25:42.364481 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.515670 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.519860 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.525424 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vzz5b" Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.532400 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.636693 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnv8r\" (UniqueName: \"kubernetes.io/projected/87197028-3222-4c04-89a7-135997258e0d-kube-api-access-wnv8r\") pod \"kube-state-metrics-0\" (UID: \"87197028-3222-4c04-89a7-135997258e0d\") " pod="openstack/kube-state-metrics-0" Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.750506 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnv8r\" (UniqueName: 
\"kubernetes.io/projected/87197028-3222-4c04-89a7-135997258e0d-kube-api-access-wnv8r\") pod \"kube-state-metrics-0\" (UID: \"87197028-3222-4c04-89a7-135997258e0d\") " pod="openstack/kube-state-metrics-0" Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.813373 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnv8r\" (UniqueName: \"kubernetes.io/projected/87197028-3222-4c04-89a7-135997258e0d-kube-api-access-wnv8r\") pod \"kube-state-metrics-0\" (UID: \"87197028-3222-4c04-89a7-135997258e0d\") " pod="openstack/kube-state-metrics-0" Feb 17 14:25:43 crc kubenswrapper[4836]: I0217 14:25:43.859895 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.798264 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.801999 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.810626 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.810788 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.810941 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.811010 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.811174 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-mlnwd" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.817094 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.928203 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5jwk\" (UniqueName: \"kubernetes.io/projected/039a526c-4f5a-4641-9340-b18459145569-kube-api-access-p5jwk\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.928268 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/039a526c-4f5a-4641-9340-b18459145569-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 
14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.928319 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/039a526c-4f5a-4641-9340-b18459145569-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.932900 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/039a526c-4f5a-4641-9340-b18459145569-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.933042 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.933325 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:44 crc kubenswrapper[4836]: I0217 14:25:44.933399 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.035702 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.035792 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.035840 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5jwk\" (UniqueName: \"kubernetes.io/projected/039a526c-4f5a-4641-9340-b18459145569-kube-api-access-p5jwk\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.035863 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/039a526c-4f5a-4641-9340-b18459145569-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.035888 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/039a526c-4f5a-4641-9340-b18459145569-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.035925 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/039a526c-4f5a-4641-9340-b18459145569-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.035950 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.038244 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/039a526c-4f5a-4641-9340-b18459145569-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.043716 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.044733 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.063123 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/039a526c-4f5a-4641-9340-b18459145569-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.068041 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5jwk\" (UniqueName: \"kubernetes.io/projected/039a526c-4f5a-4641-9340-b18459145569-kube-api-access-p5jwk\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.071092 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/039a526c-4f5a-4641-9340-b18459145569-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.079799 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/039a526c-4f5a-4641-9340-b18459145569-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"039a526c-4f5a-4641-9340-b18459145569\") " pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.103061 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.107602 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.120436 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.125330 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.125567 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.126615 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.126704 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.126966 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7d2x" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.127070 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.129408 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.135794 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.140001 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.145042 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.145188 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.145249 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.145312 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.145381 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.145415 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.145453 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.146768 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.146855 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8z8l\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-kube-api-access-t8z8l\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.146915 
4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.248805 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.248904 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8z8l\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-kube-api-access-t8z8l\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.248942 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.249008 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.249046 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.249078 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.249098 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.249130 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.249150 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 
14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.249173 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.250359 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.250664 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.253127 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.261122 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.261181 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/94da064c7e93eda9403c837c8900dc0ec43041d0305170815d7b87148c388206/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.265589 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.269561 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8z8l\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-kube-api-access-t8z8l\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.271230 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.273942 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.275604 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.279915 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.338058 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:45 crc kubenswrapper[4836]: I0217 14:25:45.471196 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.463830 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ghk5k"] Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.465022 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.468475 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-llqfn"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.473082 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.473350 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.498490 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ghk5k"]
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.563743 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-j4jj9"]
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.570961 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.596177 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-run\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.596336 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-log-ovn\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.596374 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-run-ovn\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.596405 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5949d44f-ef6d-417e-9035-9b235cd59863-ovn-controller-tls-certs\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.596436 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6bz\" (UniqueName: \"kubernetes.io/projected/5949d44f-ef6d-417e-9035-9b235cd59863-kube-api-access-9d6bz\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.596454 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5949d44f-ef6d-417e-9035-9b235cd59863-combined-ca-bundle\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.596483 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5949d44f-ef6d-417e-9035-9b235cd59863-scripts\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.605539 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j4jj9"]
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698107 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-run-ovn\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698168 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5949d44f-ef6d-417e-9035-9b235cd59863-ovn-controller-tls-certs\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698194 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-run\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698221 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6bz\" (UniqueName: \"kubernetes.io/projected/5949d44f-ef6d-417e-9035-9b235cd59863-kube-api-access-9d6bz\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698237 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5949d44f-ef6d-417e-9035-9b235cd59863-combined-ca-bundle\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698273 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5949d44f-ef6d-417e-9035-9b235cd59863-scripts\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698413 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cefe420d-f25c-4681-9ae8-b61f0a354282-scripts\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698442 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-log\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698458 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-lib\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698490 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-run\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698522 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-etc-ovs\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698540 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-log-ovn\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.698572 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dszrb\" (UniqueName: \"kubernetes.io/projected/cefe420d-f25c-4681-9ae8-b61f0a354282-kube-api-access-dszrb\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.699255 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-run-ovn\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.701118 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-run\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.701449 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5949d44f-ef6d-417e-9035-9b235cd59863-var-log-ovn\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.702873 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5949d44f-ef6d-417e-9035-9b235cd59863-scripts\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.714240 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5949d44f-ef6d-417e-9035-9b235cd59863-ovn-controller-tls-certs\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.729428 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5949d44f-ef6d-417e-9035-9b235cd59863-combined-ca-bundle\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.790100 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d6bz\" (UniqueName: \"kubernetes.io/projected/5949d44f-ef6d-417e-9035-9b235cd59863-kube-api-access-9d6bz\") pod \"ovn-controller-ghk5k\" (UID: \"5949d44f-ef6d-417e-9035-9b235cd59863\") " pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.799969 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-log\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.800023 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-lib\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.800164 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-etc-ovs\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.800198 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dszrb\" (UniqueName: \"kubernetes.io/projected/cefe420d-f25c-4681-9ae8-b61f0a354282-kube-api-access-dszrb\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.800272 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-run\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.800429 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cefe420d-f25c-4681-9ae8-b61f0a354282-scripts\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.801832 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-lib\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.801972 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-log\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.802553 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-var-run\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.802751 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cefe420d-f25c-4681-9ae8-b61f0a354282-etc-ovs\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.806483 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cefe420d-f25c-4681-9ae8-b61f0a354282-scripts\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.831007 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dszrb\" (UniqueName: \"kubernetes.io/projected/cefe420d-f25c-4681-9ae8-b61f0a354282-kube-api-access-dszrb\") pod \"ovn-controller-ovs-j4jj9\" (UID: \"cefe420d-f25c-4681-9ae8-b61f0a354282\") " pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.831464 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ghk5k"
Feb 17 14:25:46 crc kubenswrapper[4836]: I0217 14:25:46.899588 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-j4jj9"
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.835672 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.837690 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.842747 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.842995 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.843221 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.843377 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.843537 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-r7fq2"
Feb 17 14:25:47 crc kubenswrapper[4836]: I0217 14:25:47.850815 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.031541 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.031641 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v4xz\" (UniqueName: \"kubernetes.io/projected/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-kube-api-access-8v4xz\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.032414 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.032549 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.032631 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.032727 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.032865 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-config\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.033125 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.136863 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.137038 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.137950 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.137984 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v4xz\" (UniqueName: \"kubernetes.io/projected/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-kube-api-access-8v4xz\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.138135 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.138228 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.138328 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.138434 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.138595 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-config\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.142640 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.143330 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-config\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.143799 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.144081 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.144747 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.146608 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.146667 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0b21ed61e8b484d21a8479a2c41be99518c171635272917cf20fee632a18901a/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.173427 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v4xz\" (UniqueName: \"kubernetes.io/projected/348d02a8-d1b2-4bd3-9f4c-9153e24a5f19-kube-api-access-8v4xz\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.215504 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4f4437ae-059c-47fb-bde0-9623e03fca9c\") pod \"ovsdbserver-sb-0\" (UID: \"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19\") " pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:48 crc kubenswrapper[4836]: I0217 14:25:48.482516 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.707835 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.712032 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.717512 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.717862 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cxtfj"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.718114 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.723255 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.755008 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.852374 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55bc1962-7790-448a-838c-cb13a870ea23-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.852510 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bc1962-7790-448a-838c-cb13a870ea23-config\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.852617 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.852703 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.852774 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.852844 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55bc1962-7790-448a-838c-cb13a870ea23-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.852939 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w69mz\" (UniqueName: \"kubernetes.io/projected/55bc1962-7790-448a-838c-cb13a870ea23-kube-api-access-w69mz\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.853005 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955035 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55bc1962-7790-448a-838c-cb13a870ea23-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955467 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bc1962-7790-448a-838c-cb13a870ea23-config\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955511 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955546 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955600 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955621 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55bc1962-7790-448a-838c-cb13a870ea23-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955658 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w69mz\" (UniqueName: \"kubernetes.io/projected/55bc1962-7790-448a-838c-cb13a870ea23-kube-api-access-w69mz\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.955675 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.956155 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/55bc1962-7790-448a-838c-cb13a870ea23-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.957253 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bc1962-7790-448a-838c-cb13a870ea23-config\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.958334 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55bc1962-7790-448a-838c-cb13a870ea23-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.961972 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.962031 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/acb6dfbf3ca60bf020d14ded6b9677efb27a4261f5bb0945a55b0cd863775a84/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.963844 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.967884 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.977147 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a6016745-1634-4eb6-afee-b98ce9ab8f56","Type":"ContainerStarted","Data":"d9ff2705d4d9971449e56a8df2c8bcf12c30b8741924c07275058e3b28283829"}
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.979204 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w69mz\" (UniqueName: \"kubernetes.io/projected/55bc1962-7790-448a-838c-cb13a870ea23-kube-api-access-w69mz\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:50 crc kubenswrapper[4836]: I0217 14:25:50.984145 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55bc1962-7790-448a-838c-cb13a870ea23-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:51 crc kubenswrapper[4836]: I0217 14:25:51.010774 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ca3a731-ad92-46ba-aef4-201b1c5ff483\") pod \"ovsdbserver-nb-0\" (UID: \"55bc1962-7790-448a-838c-cb13a870ea23\") " pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:51 crc kubenswrapper[4836]: I0217 14:25:51.042284 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 17 14:25:51 crc kubenswrapper[4836]: I0217 14:25:51.991314 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ce3babe4-6d77-45ce-b9cc-626678d3ec64","Type":"ContainerStarted","Data":"215ebeb5a59ba861cc33d511721255c32dabb5083993a84473b1381d4f746889"}
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.035560 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.319250 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh"]
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.321006 4836 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.324779 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.325061 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.325207 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.325482 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-fc2f6" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.325503 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.333758 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh"] Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.488514 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.488653 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.488746 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.488796 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.488828 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vljr6\" (UniqueName: \"kubernetes.io/projected/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-kube-api-access-vljr6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.540872 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h"] Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.542397 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.544674 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.545737 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.552955 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h"] Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.554791 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.591065 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.591129 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vljr6\" (UniqueName: \"kubernetes.io/projected/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-kube-api-access-vljr6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.591189 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" 
(UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.591287 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.591367 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.596409 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.597864 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.605616 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.613770 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.657779 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j"] Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.659223 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.669848 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.670059 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.674119 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vljr6\" (UniqueName: \"kubernetes.io/projected/33c54f8c-91c4-4742-b545-d0e2c4e85fe2-kube-api-access-vljr6\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-r4gdh\" (UID: \"33c54f8c-91c4-4742-b545-d0e2c4e85fe2\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.687511 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j"] Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.735688 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487d19a3-7f23-4945-bfe1-6231a37a84c6-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.735743 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " 
pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.735879 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.735918 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8kl\" (UniqueName: \"kubernetes.io/projected/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-kube-api-access-fc8kl\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.735998 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.736109 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: 
I0217 14:25:56.736132 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.736186 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.736247 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmwjc\" (UniqueName: \"kubernetes.io/projected/487d19a3-7f23-4945-bfe1-6231a37a84c6-kube-api-access-dmwjc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.736283 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.736436 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.837652 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8kl\" (UniqueName: \"kubernetes.io/projected/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-kube-api-access-fc8kl\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.837731 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.837798 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.837825 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.837968 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.838859 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.839879 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmwjc\" (UniqueName: \"kubernetes.io/projected/487d19a3-7f23-4945-bfe1-6231a37a84c6-kube-api-access-dmwjc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.839918 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: 
\"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.839976 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.840073 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487d19a3-7f23-4945-bfe1-6231a37a84c6-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.840125 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.840216 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.844636 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.844930 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/487d19a3-7f23-4945-bfe1-6231a37a84c6-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.845583 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.847284 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.851007 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " 
pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.851706 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.852030 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/487d19a3-7f23-4945-bfe1-6231a37a84c6-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.890810 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"] Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.891874 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8kl\" (UniqueName: \"kubernetes.io/projected/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-kube-api-access-fc8kl\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.892753 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.896588 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.896881 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.897202 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.897520 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.899277 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.899387 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.902954 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmwjc\" (UniqueName: \"kubernetes.io/projected/487d19a3-7f23-4945-bfe1-6231a37a84c6-kube-api-access-dmwjc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j\" (UID: \"487d19a3-7f23-4945-bfe1-6231a37a84c6\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.912144 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/27c5f450-8bef-4732-a7fb-272d9b5a4ea8-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-fsq2h\" (UID: \"27c5f450-8bef-4732-a7fb-272d9b5a4ea8\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.912229 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"]
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.942850 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.942930 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.942966 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.942988 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.943014 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.943066 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.943126 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.943151 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2pt\" (UniqueName: \"kubernetes.io/projected/974f66b3-690f-4008-949d-1d57c978d427-kube-api-access-tl2pt\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.943193 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.943468 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.977527 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"]
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.979251 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:56 crc kubenswrapper[4836]: I0217 14:25:56.996911 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"]
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.027875 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-s855t"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.045252 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.059949 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.062141 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.062591 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.085277 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.100180 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2pt\" (UniqueName: \"kubernetes.io/projected/974f66b3-690f-4008-949d-1d57c978d427-kube-api-access-tl2pt\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.101609 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.102339 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.102564 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.065755 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.099968 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.046956 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.068675 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.081028 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.111264 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.080447 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.112824 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.114407 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/974f66b3-690f-4008-949d-1d57c978d427-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.130708 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/974f66b3-690f-4008-949d-1d57c978d427-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.146039 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2pt\" (UniqueName: \"kubernetes.io/projected/974f66b3-690f-4008-949d-1d57c978d427-kube-api-access-tl2pt\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-q78z5\" (UID: \"974f66b3-690f-4008-949d-1d57c978d427\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.177497 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212525 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212599 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212654 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212694 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212749 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212778 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212847 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212884 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8s9\" (UniqueName: \"kubernetes.io/projected/a977b831-7959-4509-93bf-a45b375ca722-kube-api-access-7f8s9\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.212920 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.273654 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316443 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316512 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8s9\" (UniqueName: \"kubernetes.io/projected/a977b831-7959-4509-93bf-a45b375ca722-kube-api-access-7f8s9\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316544 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316599 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316616 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316650 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316676 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316709 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.316725 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.318536 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.318592 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.318628 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.318905 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.319432 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/a977b831-7959-4509-93bf-a45b375ca722-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.320617 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.320651 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.321096 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/a977b831-7959-4509-93bf-a45b375ca722-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.343106 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8s9\" (UniqueName: \"kubernetes.io/projected/a977b831-7959-4509-93bf-a45b375ca722-kube-api-access-7f8s9\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-nbvnf\" (UID: \"a977b831-7959-4509-93bf-a45b375ca722\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.347983 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.495392 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"]
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.496569 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.498572 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.499441 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.552190 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"]
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.633719 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.633859 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.633915 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.633984 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.634116 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c33fb01-9bf7-43f1-86d5-004e70d3721c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.634154 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.634204 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hckzp\" (UniqueName: \"kubernetes.io/projected/1c33fb01-9bf7-43f1-86d5-004e70d3721c-kube-api-access-hckzp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.634255 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.653570 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"]
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.655002 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.659380 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.659575 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.695144 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"]
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735558 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735626 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735656 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735716 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735754 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9wnr\" (UniqueName: \"kubernetes.io/projected/e2c3e649-7933-49e2-800c-b66dbd377ac6-kube-api-access-z9wnr\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735782 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c33fb01-9bf7-43f1-86d5-004e70d3721c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735819 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735846 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.735964 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c3e649-7933-49e2-800c-b66dbd377ac6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736071 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hckzp\" (UniqueName: \"kubernetes.io/projected/1c33fb01-9bf7-43f1-86d5-004e70d3721c-kube-api-access-hckzp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736110 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736163 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736192 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736310 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736409 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736456 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736543 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.736882 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c33fb01-9bf7-43f1-86d5-004e70d3721c-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID:
\"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.737085 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.741065 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.742144 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.745431 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/1c33fb01-9bf7-43f1-86d5-004e70d3721c-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.762335 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hckzp\" (UniqueName: 
\"kubernetes.io/projected/1c33fb01-9bf7-43f1-86d5-004e70d3721c-kube-api-access-hckzp\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.763634 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.766244 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"1c33fb01-9bf7-43f1-86d5-004e70d3721c\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.799202 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.802466 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.805618 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.806016 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.822557 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.839706 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c3e649-7933-49e2-800c-b66dbd377ac6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.839782 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.839840 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.839894 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.839946 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmc66\" (UniqueName: 
\"kubernetes.io/projected/d370240e-d6c1-4d9c-9877-293afa6e77f2-kube-api-access-rmc66\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.839989 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840017 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d370240e-d6c1-4d9c-9877-293afa6e77f2-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840047 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840100 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840130 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840168 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840331 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840420 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9wnr\" (UniqueName: \"kubernetes.io/projected/e2c3e649-7933-49e2-800c-b66dbd377ac6-kube-api-access-z9wnr\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.840496 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-compactor-http\") pod 
\"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.842045 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.842343 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c3e649-7933-49e2-800c-b66dbd377ac6-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.843214 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.843991 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.858023 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: 
\"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.862448 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/e2c3e649-7933-49e2-800c-b66dbd377ac6-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.900525 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wnr\" (UniqueName: \"kubernetes.io/projected/e2c3e649-7933-49e2-800c-b66dbd377ac6-kube-api-access-z9wnr\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.917147 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"e2c3e649-7933-49e2-800c-b66dbd377ac6\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.918200 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.942742 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " 
pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.942936 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.943054 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmc66\" (UniqueName: \"kubernetes.io/projected/d370240e-d6c1-4d9c-9877-293afa6e77f2-kube-api-access-rmc66\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.943139 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.943200 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d370240e-d6c1-4d9c-9877-293afa6e77f2-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.943226 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: 
\"d370240e-d6c1-4d9c-9877-293afa6e77f2\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.944065 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.944145 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.944455 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d370240e-d6c1-4d9c-9877-293afa6e77f2-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.944641 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.952479 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" 
(UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.952859 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.955049 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/d370240e-d6c1-4d9c-9877-293afa6e77f2-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.967210 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmc66\" (UniqueName: \"kubernetes.io/projected/d370240e-d6c1-4d9c-9877-293afa6e77f2-kube-api-access-rmc66\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.982404 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"d370240e-d6c1-4d9c-9877-293afa6e77f2\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:25:57 crc kubenswrapper[4836]: I0217 14:25:57.996925 4836 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:25:58 crc kubenswrapper[4836]: I0217 14:25:58.146056 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:26:00 crc kubenswrapper[4836]: I0217 14:26:00.135502 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:26:00 crc kubenswrapper[4836]: I0217 14:26:00.135598 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:26:05 crc kubenswrapper[4836]: E0217 14:26:05.777237 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 17 14:26:05 crc kubenswrapper[4836]: E0217 14:26:05.779314 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6785l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(a6016745-1634-4eb6-afee-b98ce9ab8f56): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:05 crc kubenswrapper[4836]: E0217 14:26:05.781031 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="a6016745-1634-4eb6-afee-b98ce9ab8f56" Feb 17 14:26:06 crc kubenswrapper[4836]: E0217 14:26:06.238758 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="a6016745-1634-4eb6-afee-b98ce9ab8f56" Feb 17 14:26:06 crc kubenswrapper[4836]: E0217 14:26:06.492698 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Feb 17 14:26:06 crc kubenswrapper[4836]: E0217 14:26:06.492901 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5b4h5fbh5d6h5f9h9bh659h675h56dhbch5fdh68bh9fh699h5b6h5b5h668hcdhcdh65h68fh649h59h8bh64fh65bhc7h569h68hb8h544h5bbh686q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wcpcn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(ce3babe4-6d77-45ce-b9cc-626678d3ec64): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:06 crc kubenswrapper[4836]: E0217 14:26:06.494003 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="ce3babe4-6d77-45ce-b9cc-626678d3ec64" Feb 17 14:26:07 crc kubenswrapper[4836]: I0217 14:26:07.249322 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerStarted","Data":"cefd70541e5e6c57648aaec13bc3ac8008ad32d2cca2fd2d95d8a18012223fb3"} Feb 17 14:26:07 crc kubenswrapper[4836]: I0217 14:26:07.250909 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"55bc1962-7790-448a-838c-cb13a870ea23","Type":"ContainerStarted","Data":"06062ba94e5713ddd8c227b30be89f5edc2cd18f7f07ef99efd389454307ed51"} Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.252489 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="ce3babe4-6d77-45ce-b9cc-626678d3ec64" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.625470 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.625719 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4zckh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-jbcz5_openstack(e14b6d2f-85ef-4f0c-8a81-426aee02b456): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.626944 4836 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.636035 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.636208 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qmnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-46wms_openstack(f1ebdbfb-7f75-4205-80ca-0ee085a21c0b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.637352 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" podUID="f1ebdbfb-7f75-4205-80ca-0ee085a21c0b" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.639898 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.640077 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pvxj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePul
lPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-ggz9w_openstack(24a665ea-1793-426d-b4df-48bfdd048f1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.641776 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" podUID="24a665ea-1793-426d-b4df-48bfdd048f1c" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.673261 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.673438 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ntrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-s6vqb_openstack(63d320ce-8669-4285-b4bc-dbb6eeb9a190): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:26:07 crc kubenswrapper[4836]: E0217 14:26:07.675400 4836 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" Feb 17 14:26:08 crc kubenswrapper[4836]: I0217 14:26:08.162258 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 17 14:26:08 crc kubenswrapper[4836]: I0217 14:26:08.323340 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"039a526c-4f5a-4641-9340-b18459145569","Type":"ContainerStarted","Data":"4cbe90198ede73e79e317e522482bdcf15991436a05ef6e581e62bb3968c9ce7"} Feb 17 14:26:08 crc kubenswrapper[4836]: E0217 14:26:08.334512 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" Feb 17 14:26:08 crc kubenswrapper[4836]: E0217 14:26:08.343536 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" Feb 17 14:26:08 crc kubenswrapper[4836]: I0217 14:26:08.421123 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 14:26:08 crc kubenswrapper[4836]: I0217 14:26:08.991980 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ghk5k"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.008366 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:26:09 crc 
kubenswrapper[4836]: W0217 14:26:09.067896 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5949d44f_ef6d_417e_9035_9b235cd59863.slice/crio-b5ab6de8a0439c7bcacdde47f557ba468960c7db13041b91e5eadd707b7c9b08 WatchSource:0}: Error finding container b5ab6de8a0439c7bcacdde47f557ba468960c7db13041b91e5eadd707b7c9b08: Status 404 returned error can't find the container with id b5ab6de8a0439c7bcacdde47f557ba468960c7db13041b91e5eadd707b7c9b08 Feb 17 14:26:09 crc kubenswrapper[4836]: W0217 14:26:09.079749 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87197028_3222_4c04_89a7_135997258e0d.slice/crio-ac1cfd9dcf6c1abc6e025d9d148f792c18f57e036146c98f80e4d81f4745553b WatchSource:0}: Error finding container ac1cfd9dcf6c1abc6e025d9d148f792c18f57e036146c98f80e4d81f4745553b: Status 404 returned error can't find the container with id ac1cfd9dcf6c1abc6e025d9d148f792c18f57e036146c98f80e4d81f4745553b Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.347307 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ghk5k" event={"ID":"5949d44f-ef6d-417e-9035-9b235cd59863","Type":"ContainerStarted","Data":"b5ab6de8a0439c7bcacdde47f557ba468960c7db13041b91e5eadd707b7c9b08"} Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.357983 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19","Type":"ContainerStarted","Data":"dfa019eb6dfc780d8a7bb7c10f837f86c6cf9b05a42be0d211fd953b53b28d68"} Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.363844 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"87197028-3222-4c04-89a7-135997258e0d","Type":"ContainerStarted","Data":"ac1cfd9dcf6c1abc6e025d9d148f792c18f57e036146c98f80e4d81f4745553b"} Feb 17 14:26:09 crc 
kubenswrapper[4836]: I0217 14:26:09.372165 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fd891e0-6f97-4fa3-8281-aa97232d6c6d","Type":"ContainerStarted","Data":"7589ff250191c7eebfbce02cc148fe3104e0d0057941b75d9ae842fb9b393bcb"} Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.452652 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-j4jj9"] Feb 17 14:26:09 crc kubenswrapper[4836]: W0217 14:26:09.467223 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcefe420d_f25c_4681_9ae8_b61f0a354282.slice/crio-f7165b4765ba88e3f536fab00d381ac7718ee33e729a9b3168c1033bcef519d3 WatchSource:0}: Error finding container f7165b4765ba88e3f536fab00d381ac7718ee33e729a9b3168c1033bcef519d3: Status 404 returned error can't find the container with id f7165b4765ba88e3f536fab00d381ac7718ee33e729a9b3168c1033bcef519d3 Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.689625 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.724446 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.753874 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.767584 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.778231 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.791356 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.802320 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.812596 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh"] Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.820677 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.821268 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:26:09 crc kubenswrapper[4836]: W0217 14:26:09.936991 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod487d19a3_7f23_4945_bfe1_6231a37a84c6.slice/crio-ba38186ea7f2c89b3b027c148184bd75cf8b2c02b4be30bfeeca2d6fd8389527 WatchSource:0}: Error finding container ba38186ea7f2c89b3b027c148184bd75cf8b2c02b4be30bfeeca2d6fd8389527: Status 404 returned error can't find the container with id ba38186ea7f2c89b3b027c148184bd75cf8b2c02b4be30bfeeca2d6fd8389527 Feb 17 14:26:09 crc kubenswrapper[4836]: W0217 14:26:09.941843 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c3e649_7933_49e2_800c_b66dbd377ac6.slice/crio-ce854d82f7fdab0f8db2fadd426f8a89bbf9188aaeda8fb1b8be61e0563586d8 WatchSource:0}: Error finding container ce854d82f7fdab0f8db2fadd426f8a89bbf9188aaeda8fb1b8be61e0563586d8: Status 404 returned error can't find the container with id ce854d82f7fdab0f8db2fadd426f8a89bbf9188aaeda8fb1b8be61e0563586d8 Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.945585 4836 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-config\") pod \"24a665ea-1793-426d-b4df-48bfdd048f1c\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.945643 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qmnr\" (UniqueName: \"kubernetes.io/projected/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-kube-api-access-8qmnr\") pod \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") " Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.945719 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-config\") pod \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\" (UID: \"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b\") " Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.945794 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-dns-svc\") pod \"24a665ea-1793-426d-b4df-48bfdd048f1c\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.945851 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvxj4\" (UniqueName: \"kubernetes.io/projected/24a665ea-1793-426d-b4df-48bfdd048f1c-kube-api-access-pvxj4\") pod \"24a665ea-1793-426d-b4df-48bfdd048f1c\" (UID: \"24a665ea-1793-426d-b4df-48bfdd048f1c\") " Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.948957 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24a665ea-1793-426d-b4df-48bfdd048f1c" (UID: "24a665ea-1793-426d-b4df-48bfdd048f1c"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.950606 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-config" (OuterVolumeSpecName: "config") pod "24a665ea-1793-426d-b4df-48bfdd048f1c" (UID: "24a665ea-1793-426d-b4df-48bfdd048f1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.955291 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-kube-api-access-8qmnr" (OuterVolumeSpecName: "kube-api-access-8qmnr") pod "f1ebdbfb-7f75-4205-80ca-0ee085a21c0b" (UID: "f1ebdbfb-7f75-4205-80ca-0ee085a21c0b"). InnerVolumeSpecName "kube-api-access-8qmnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.955433 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a665ea-1793-426d-b4df-48bfdd048f1c-kube-api-access-pvxj4" (OuterVolumeSpecName: "kube-api-access-pvxj4") pod "24a665ea-1793-426d-b4df-48bfdd048f1c" (UID: "24a665ea-1793-426d-b4df-48bfdd048f1c"). InnerVolumeSpecName "kube-api-access-pvxj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:09 crc kubenswrapper[4836]: I0217 14:26:09.961777 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-config" (OuterVolumeSpecName: "config") pod "f1ebdbfb-7f75-4205-80ca-0ee085a21c0b" (UID: "f1ebdbfb-7f75-4205-80ca-0ee085a21c0b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:09 crc kubenswrapper[4836]: W0217 14:26:09.965796 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33c54f8c_91c4_4742_b545_d0e2c4e85fe2.slice/crio-2aea778e655b61fe26ab4925dc0dec08762b82aac7ed8bba7974e98ef9d0e2f2 WatchSource:0}: Error finding container 2aea778e655b61fe26ab4925dc0dec08762b82aac7ed8bba7974e98ef9d0e2f2: Status 404 returned error can't find the container with id 2aea778e655b61fe26ab4925dc0dec08762b82aac7ed8bba7974e98ef9d0e2f2 Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.048633 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.048687 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qmnr\" (UniqueName: \"kubernetes.io/projected/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-kube-api-access-8qmnr\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.048701 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.048715 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24a665ea-1793-426d-b4df-48bfdd048f1c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.048727 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvxj4\" (UniqueName: \"kubernetes.io/projected/24a665ea-1793-426d-b4df-48bfdd048f1c-kube-api-access-pvxj4\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.386395 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf" event={"ID":"a977b831-7959-4509-93bf-a45b375ca722","Type":"ContainerStarted","Data":"197e017888d334883246dc517cc6ebcda5d85e0d0ec5f5b401f0cc5f753788a8"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.387988 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4jj9" event={"ID":"cefe420d-f25c-4681-9ae8-b61f0a354282","Type":"ContainerStarted","Data":"f7165b4765ba88e3f536fab00d381ac7718ee33e729a9b3168c1033bcef519d3"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.391543 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f866bb7-5209-4275-8884-df6f074b3f7c","Type":"ContainerStarted","Data":"85576fe15acb4ec82e880a96b65a7ac8f381e29f3114bed6ed63c37985fe03f0"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.393981 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" event={"ID":"27c5f450-8bef-4732-a7fb-272d9b5a4ea8","Type":"ContainerStarted","Data":"402269aa664e653829b9605208deffb731e1154b5ccc2f3a77365bf572021284"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.397077 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"e2c3e649-7933-49e2-800c-b66dbd377ac6","Type":"ContainerStarted","Data":"ce854d82f7fdab0f8db2fadd426f8a89bbf9188aaeda8fb1b8be61e0563586d8"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.398991 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" event={"ID":"33c54f8c-91c4-4742-b545-d0e2c4e85fe2","Type":"ContainerStarted","Data":"2aea778e655b61fe26ab4925dc0dec08762b82aac7ed8bba7974e98ef9d0e2f2"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.400013 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5" event={"ID":"974f66b3-690f-4008-949d-1d57c978d427","Type":"ContainerStarted","Data":"909cbde8c79ad044d96184d526b2b525ce1fb1c3c2bb3d48d600956da9444a32"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.403232 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" event={"ID":"24a665ea-1793-426d-b4df-48bfdd048f1c","Type":"ContainerDied","Data":"a6bb86072d0798a1afb87a4b547d5cc9e6e5bb7f7f723d97aa4e7592e494e55c"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.403328 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ggz9w" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.407001 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"1c33fb01-9bf7-43f1-86d5-004e70d3721c","Type":"ContainerStarted","Data":"da5a1432ebe8c39c9387ed27f7ccb7165ee9a6ce317dda015178127c8bea9a8f"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.409573 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ec9408e6-0474-4f84-842e-b1c20f42a7b8","Type":"ContainerStarted","Data":"1e0077eb33d7cdccabd3d53eadba26bb33ef9899ccdc0c0e3003d7b300233249"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.412452 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" event={"ID":"487d19a3-7f23-4945-bfe1-6231a37a84c6","Type":"ContainerStarted","Data":"ba38186ea7f2c89b3b027c148184bd75cf8b2c02b4be30bfeeca2d6fd8389527"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.418129 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" event={"ID":"f1ebdbfb-7f75-4205-80ca-0ee085a21c0b","Type":"ContainerDied","Data":"b0dca2b7fd572359a51505f1dec3dc3d3db3e7f58bf21d6f39749cf427d85d3b"} Feb 17 
14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.418207 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-46wms" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.447720 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"d370240e-d6c1-4d9c-9877-293afa6e77f2","Type":"ContainerStarted","Data":"15680e21e6ac532d32c569f9b60dab424465fbc5f504aee69bdfe34e5577a70d"} Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.529734 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ggz9w"] Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.542216 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ggz9w"] Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.638051 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a665ea-1793-426d-b4df-48bfdd048f1c" path="/var/lib/kubelet/pods/24a665ea-1793-426d-b4df-48bfdd048f1c/volumes" Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.643406 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-46wms"] Feb 17 14:26:10 crc kubenswrapper[4836]: I0217 14:26:10.643706 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-46wms"] Feb 17 14:26:12 crc kubenswrapper[4836]: I0217 14:26:12.582340 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ebdbfb-7f75-4205-80ca-0ee085a21c0b" path="/var/lib/kubelet/pods/f1ebdbfb-7f75-4205-80ca-0ee085a21c0b/volumes" Feb 17 14:26:13 crc kubenswrapper[4836]: I0217 14:26:13.496160 4836 generic.go:334] "Generic (PLEG): container finished" podID="2fd891e0-6f97-4fa3-8281-aa97232d6c6d" containerID="7589ff250191c7eebfbce02cc148fe3104e0d0057941b75d9ae842fb9b393bcb" exitCode=0 Feb 17 14:26:13 crc kubenswrapper[4836]: I0217 14:26:13.496211 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fd891e0-6f97-4fa3-8281-aa97232d6c6d","Type":"ContainerDied","Data":"7589ff250191c7eebfbce02cc148fe3104e0d0057941b75d9ae842fb9b393bcb"} Feb 17 14:26:19 crc kubenswrapper[4836]: I0217 14:26:19.727910 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"2fd891e0-6f97-4fa3-8281-aa97232d6c6d","Type":"ContainerStarted","Data":"cbff2d76a45a19fc91e95a754dc92867ba6368787be797f1530c25ebdc789c33"} Feb 17 14:26:19 crc kubenswrapper[4836]: I0217 14:26:19.731007 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" event={"ID":"487d19a3-7f23-4945-bfe1-6231a37a84c6","Type":"ContainerStarted","Data":"2c7e51c42a8648fdf229dc91eb17c49c900557e3036f650c0634dfc08051dcbb"} Feb 17 14:26:19 crc kubenswrapper[4836]: I0217 14:26:19.731195 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" Feb 17 14:26:19 crc kubenswrapper[4836]: I0217 14:26:19.788401 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=14.216513853 podStartE2EDuration="41.788362695s" podCreationTimestamp="2026-02-17 14:25:38 +0000 UTC" firstStartedPulling="2026-02-17 14:25:40.357838521 +0000 UTC m=+1166.700789321" lastFinishedPulling="2026-02-17 14:26:07.929709894 +0000 UTC m=+1194.272638163" observedRunningTime="2026-02-17 14:26:19.755009892 +0000 UTC m=+1206.097938181" watchObservedRunningTime="2026-02-17 14:26:19.788362695 +0000 UTC m=+1206.131291134" Feb 17 14:26:19 crc kubenswrapper[4836]: I0217 14:26:19.791913 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j" podStartSLOduration=16.587228087 podStartE2EDuration="23.791886858s" 
podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.940014041 +0000 UTC m=+1196.282942310" lastFinishedPulling="2026-02-17 14:26:17.144672812 +0000 UTC m=+1203.487601081" observedRunningTime="2026-02-17 14:26:19.78365839 +0000 UTC m=+1206.126586779" watchObservedRunningTime="2026-02-17 14:26:19.791886858 +0000 UTC m=+1206.134815127" Feb 17 14:26:20 crc kubenswrapper[4836]: I0217 14:26:20.753052 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerStarted","Data":"1aeb38549c5093ddcbd19fe025e8df306afcc08ba355a33bcd16537686f0d989"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.762467 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"e2c3e649-7933-49e2-800c-b66dbd377ac6","Type":"ContainerStarted","Data":"64e1582fb06d05de2483b260f089507530e9d2d49cb5a107d301863d329da8a2"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.763140 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.766015 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" event={"ID":"33c54f8c-91c4-4742-b545-d0e2c4e85fe2","Type":"ContainerStarted","Data":"03ebbb1b3ce184b45c3662c6adbec0c02ab9f4f09ca958693abcdfeffe8f9ee5"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.766154 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.768956 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" 
event={"ID":"1c33fb01-9bf7-43f1-86d5-004e70d3721c","Type":"ContainerStarted","Data":"e35267da76ae00fa184a439eba6ca0d6a766d7b2cc5eb5e026aaf5d342b3a4f6"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.769145 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.771852 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ghk5k" event={"ID":"5949d44f-ef6d-417e-9035-9b235cd59863","Type":"ContainerStarted","Data":"32287f9f678dbaed07061e53120bb2d03b0a8363de0671992dceb7e7bed21aa9"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.772022 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ghk5k" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.773961 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" event={"ID":"27c5f450-8bef-4732-a7fb-272d9b5a4ea8","Type":"ContainerStarted","Data":"dde5854aa96cd113469c300d94b30c2eb9058189e0ab789d9fea33d42b96a117"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.774007 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.776244 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"d370240e-d6c1-4d9c-9877-293afa6e77f2","Type":"ContainerStarted","Data":"ba0ecd0b100eb9ea63f73bf1a5ca84603cc3ebac6a987a21846e914b41a2eb7d"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.776343 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.778049 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf" event={"ID":"a977b831-7959-4509-93bf-a45b375ca722","Type":"ContainerStarted","Data":"05ad14d1ceb71d26ff8ffe5dbb522f582d3b9ff7f132b341d7172c0b5b23e36c"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.778236 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.787470 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ce3babe4-6d77-45ce-b9cc-626678d3ec64","Type":"ContainerStarted","Data":"4e19f0fa1d919c16dad83c81a615cac893a67fd86f695e784545453a405b8ac2"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.789376 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.796786 4836 generic.go:334] "Generic (PLEG): container finished" podID="cefe420d-f25c-4681-9ae8-b61f0a354282" containerID="636101e7aa1ec6e32cd2fd443d1dc53a1da78342ee17ec1c5f1e7bf1f82351d7" exitCode=0 Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.796885 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4jj9" event={"ID":"cefe420d-f25c-4681-9ae8-b61f0a354282","Type":"ContainerDied","Data":"636101e7aa1ec6e32cd2fd443d1dc53a1da78342ee17ec1c5f1e7bf1f82351d7"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.801595 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"87197028-3222-4c04-89a7-135997258e0d","Type":"ContainerStarted","Data":"6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.802204 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.804690 4836 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55bc1962-7790-448a-838c-cb13a870ea23","Type":"ContainerStarted","Data":"f31dffc396a57b84b92df802b0646ee470765b8b992c8550f0246d91f5466b27"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.805141 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=18.295408192 podStartE2EDuration="25.805109883s" podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.9490564 +0000 UTC m=+1196.291984669" lastFinishedPulling="2026-02-17 14:26:17.458758071 +0000 UTC m=+1203.801686360" observedRunningTime="2026-02-17 14:26:21.786440259 +0000 UTC m=+1208.129368538" watchObservedRunningTime="2026-02-17 14:26:21.805109883 +0000 UTC m=+1208.148038162" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.807868 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5" event={"ID":"974f66b3-690f-4008-949d-1d57c978d427","Type":"ContainerStarted","Data":"9930295418d6f14f49b1a38e8481e519c6d37978249d30a530eb887ce5e5ce4a"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.807918 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.808281 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.820903 4836 generic.go:334] "Generic (PLEG): container finished" podID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerID="57ea1eebc786d3a8ae12a685cfa802406deab325110c652e436a68a0c258022f" exitCode=0 Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.821076 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" 
event={"ID":"63d320ce-8669-4285-b4bc-dbb6eeb9a190","Type":"ContainerDied","Data":"57ea1eebc786d3a8ae12a685cfa802406deab325110c652e436a68a0c258022f"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.821522 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h" podStartSLOduration=18.552252726 podStartE2EDuration="25.821501737s" podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.967938041 +0000 UTC m=+1196.310866310" lastFinishedPulling="2026-02-17 14:26:17.237187052 +0000 UTC m=+1203.580115321" observedRunningTime="2026-02-17 14:26:21.809698434 +0000 UTC m=+1208.152626723" watchObservedRunningTime="2026-02-17 14:26:21.821501737 +0000 UTC m=+1208.164430006" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.827043 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19","Type":"ContainerStarted","Data":"a0c6837423c83012243ded0c8254010821ac471a614b60ef2aa6c50c514ceee8"} Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.832964 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.862627 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=18.577280219 podStartE2EDuration="25.862601095s" podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.951404343 +0000 UTC m=+1196.294332612" lastFinishedPulling="2026-02-17 14:26:17.236725219 +0000 UTC m=+1203.579653488" observedRunningTime="2026-02-17 14:26:21.841876146 +0000 UTC m=+1208.184804425" watchObservedRunningTime="2026-02-17 14:26:21.862601095 +0000 UTC m=+1208.205529374" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.866446 4836 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=17.75089924 podStartE2EDuration="25.866421617s" podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.693038109 +0000 UTC m=+1196.035966378" lastFinishedPulling="2026-02-17 14:26:17.808560486 +0000 UTC m=+1204.151488755" observedRunningTime="2026-02-17 14:26:21.859649197 +0000 UTC m=+1208.202577486" watchObservedRunningTime="2026-02-17 14:26:21.866421617 +0000 UTC m=+1208.209349886" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.911064 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-nbvnf" podStartSLOduration=18.662474646 podStartE2EDuration="25.911035219s" podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.953066807 +0000 UTC m=+1196.295995076" lastFinishedPulling="2026-02-17 14:26:17.20162736 +0000 UTC m=+1203.544555649" observedRunningTime="2026-02-17 14:26:21.891491971 +0000 UTC m=+1208.234420250" watchObservedRunningTime="2026-02-17 14:26:21.911035219 +0000 UTC m=+1208.253963488" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.940323 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh" podStartSLOduration=18.674279699 podStartE2EDuration="25.940279993s" podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.971998539 +0000 UTC m=+1196.314926808" lastFinishedPulling="2026-02-17 14:26:17.237998833 +0000 UTC m=+1203.580927102" observedRunningTime="2026-02-17 14:26:21.916264997 +0000 UTC m=+1208.259193286" watchObservedRunningTime="2026-02-17 14:26:21.940279993 +0000 UTC m=+1208.283208262" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.959760 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-ghk5k" podStartSLOduration=27.700684741 podStartE2EDuration="35.959721638s" podCreationTimestamp="2026-02-17 14:25:46 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.072972116 +0000 UTC m=+1195.415900385" lastFinishedPulling="2026-02-17 14:26:17.332009013 +0000 UTC m=+1203.674937282" observedRunningTime="2026-02-17 14:26:21.948076429 +0000 UTC m=+1208.291004718" watchObservedRunningTime="2026-02-17 14:26:21.959721638 +0000 UTC m=+1208.302649907" Feb 17 14:26:21 crc kubenswrapper[4836]: I0217 14:26:21.985834 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.584460113 podStartE2EDuration="41.985799949s" podCreationTimestamp="2026-02-17 14:25:40 +0000 UTC" firstStartedPulling="2026-02-17 14:25:51.666832788 +0000 UTC m=+1178.009761077" lastFinishedPulling="2026-02-17 14:26:21.068172644 +0000 UTC m=+1207.411100913" observedRunningTime="2026-02-17 14:26:21.969307772 +0000 UTC m=+1208.312236041" watchObservedRunningTime="2026-02-17 14:26:21.985799949 +0000 UTC m=+1208.328728218" Feb 17 14:26:22 crc kubenswrapper[4836]: I0217 14:26:22.007599 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=28.177142979 podStartE2EDuration="39.007569046s" podCreationTimestamp="2026-02-17 14:25:43 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.095107022 +0000 UTC m=+1195.438035291" lastFinishedPulling="2026-02-17 14:26:19.925533079 +0000 UTC m=+1206.268461358" observedRunningTime="2026-02-17 14:26:21.993666657 +0000 UTC m=+1208.336594926" watchObservedRunningTime="2026-02-17 14:26:22.007569046 +0000 UTC m=+1208.350497315" Feb 17 14:26:22 crc kubenswrapper[4836]: I0217 14:26:22.095712 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-q78z5" podStartSLOduration=18.493372907 podStartE2EDuration="26.09567535s" 
podCreationTimestamp="2026-02-17 14:25:56 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.733420479 +0000 UTC m=+1196.076348748" lastFinishedPulling="2026-02-17 14:26:17.335722922 +0000 UTC m=+1203.678651191" observedRunningTime="2026-02-17 14:26:22.093223364 +0000 UTC m=+1208.436151653" watchObservedRunningTime="2026-02-17 14:26:22.09567535 +0000 UTC m=+1208.438603619" Feb 17 14:26:22 crc kubenswrapper[4836]: I0217 14:26:22.838522 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a6016745-1634-4eb6-afee-b98ce9ab8f56","Type":"ContainerStarted","Data":"aee74edc0c06a08e555878906493cce427efbca90aaeb3c4fe3a23355ef32693"} Feb 17 14:26:23 crc kubenswrapper[4836]: I0217 14:26:23.870226 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"039a526c-4f5a-4641-9340-b18459145569","Type":"ContainerStarted","Data":"9fc719884946b23c18eb39d431c1a3a86925f7b12eb5058327ff5297c2544b72"} Feb 17 14:26:23 crc kubenswrapper[4836]: I0217 14:26:23.879852 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" event={"ID":"63d320ce-8669-4285-b4bc-dbb6eeb9a190","Type":"ContainerStarted","Data":"d1ced8732b18e9a32bcf99bb2f034caca8afbe19ad1c6c3a49849748da69630c"} Feb 17 14:26:23 crc kubenswrapper[4836]: I0217 14:26:23.880232 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" Feb 17 14:26:23 crc kubenswrapper[4836]: I0217 14:26:23.901735 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"348d02a8-d1b2-4bd3-9f4c-9153e24a5f19","Type":"ContainerStarted","Data":"1ec68336a3c5494d166918ccd0c9bb1885725856abc7c73cfa1b9a88ce8c4dbe"} Feb 17 14:26:23 crc kubenswrapper[4836]: I0217 14:26:23.905061 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4jj9" 
event={"ID":"cefe420d-f25c-4681-9ae8-b61f0a354282","Type":"ContainerStarted","Data":"d3bd473c9d3b050f0ef16304bf8861b295846719e86a7ff11a5cbe0b0bfbab0b"} Feb 17 14:26:23 crc kubenswrapper[4836]: I0217 14:26:23.907111 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"55bc1962-7790-448a-838c-cb13a870ea23","Type":"ContainerStarted","Data":"3a2f0903fa9451947c8daa39f2fd1b4f6ad75329d2c8ec14431d2d465b026a83"} Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.016669 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=23.155248515 podStartE2EDuration="38.016644781s" podCreationTimestamp="2026-02-17 14:25:46 +0000 UTC" firstStartedPulling="2026-02-17 14:26:08.459342422 +0000 UTC m=+1194.802270691" lastFinishedPulling="2026-02-17 14:26:23.320738688 +0000 UTC m=+1209.663666957" observedRunningTime="2026-02-17 14:26:24.000215025 +0000 UTC m=+1210.343143324" watchObservedRunningTime="2026-02-17 14:26:24.016644781 +0000 UTC m=+1210.359573060" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.033375 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" podStartSLOduration=5.119360009 podStartE2EDuration="48.033339852s" podCreationTimestamp="2026-02-17 14:25:36 +0000 UTC" firstStartedPulling="2026-02-17 14:25:38.158182537 +0000 UTC m=+1164.501110796" lastFinishedPulling="2026-02-17 14:26:21.07216236 +0000 UTC m=+1207.415090639" observedRunningTime="2026-02-17 14:26:24.023946484 +0000 UTC m=+1210.366874753" watchObservedRunningTime="2026-02-17 14:26:24.033339852 +0000 UTC m=+1210.376268121" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.043698 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.067218 4836 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.269132058 podStartE2EDuration="35.067187469s" podCreationTimestamp="2026-02-17 14:25:49 +0000 UTC" firstStartedPulling="2026-02-17 14:26:06.510669568 +0000 UTC m=+1192.853597837" lastFinishedPulling="2026-02-17 14:26:23.308724979 +0000 UTC m=+1209.651653248" observedRunningTime="2026-02-17 14:26:24.063641085 +0000 UTC m=+1210.406569364" watchObservedRunningTime="2026-02-17 14:26:24.067187469 +0000 UTC m=+1210.410115738" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.096461 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.483652 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.524485 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.918684 4836 generic.go:334] "Generic (PLEG): container finished" podID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerID="7bca88336a02b6b00bc19416ffdd2164736c7a5342d72427305b5d2ff3839adf" exitCode=0 Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.918737 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" event={"ID":"e14b6d2f-85ef-4f0c-8a81-426aee02b456","Type":"ContainerDied","Data":"7bca88336a02b6b00bc19416ffdd2164736c7a5342d72427305b5d2ff3839adf"} Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.922276 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-j4jj9" event={"ID":"cefe420d-f25c-4681-9ae8-b61f0a354282","Type":"ContainerStarted","Data":"b3cdaabc9e929f938f58423846d3d6283236f41b4567b11e16979ebd00a0d473"} Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.922994 4836 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.923021 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 17 14:26:24 crc kubenswrapper[4836]: I0217 14:26:24.964540 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-j4jj9" podStartSLOduration=31.295861317 podStartE2EDuration="38.964516747s" podCreationTimestamp="2026-02-17 14:25:46 +0000 UTC" firstStartedPulling="2026-02-17 14:26:09.476558746 +0000 UTC m=+1195.819487015" lastFinishedPulling="2026-02-17 14:26:17.145214176 +0000 UTC m=+1203.488142445" observedRunningTime="2026-02-17 14:26:24.961113026 +0000 UTC m=+1211.304041295" watchObservedRunningTime="2026-02-17 14:26:24.964516747 +0000 UTC m=+1211.307445006" Feb 17 14:26:25 crc kubenswrapper[4836]: I0217 14:26:25.934387 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" event={"ID":"e14b6d2f-85ef-4f0c-8a81-426aee02b456","Type":"ContainerStarted","Data":"3ea7444a593f14512dc997fa7b4dd1c0aa61dd99e0888b276623626f9d806659"} Feb 17 14:26:25 crc kubenswrapper[4836]: I0217 14:26:25.934918 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j4jj9" Feb 17 14:26:25 crc kubenswrapper[4836]: I0217 14:26:25.935142 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-j4jj9" Feb 17 14:26:25 crc kubenswrapper[4836]: I0217 14:26:25.935573 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" Feb 17 14:26:25 crc kubenswrapper[4836]: I0217 14:26:25.957048 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" podStartSLOduration=-9223371986.897753 podStartE2EDuration="49.957022325s" podCreationTimestamp="2026-02-17 14:25:36 +0000 UTC" 
firstStartedPulling="2026-02-17 14:25:37.34631574 +0000 UTC m=+1163.689244009" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:25.956248765 +0000 UTC m=+1212.299177054" watchObservedRunningTime="2026-02-17 14:26:25.957022325 +0000 UTC m=+1212.299950594" Feb 17 14:26:25 crc kubenswrapper[4836]: I0217 14:26:25.976713 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.265795 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s6vqb"] Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.266105 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerName="dnsmasq-dns" containerID="cri-o://d1ced8732b18e9a32bcf99bb2f034caca8afbe19ad1c6c3a49849748da69630c" gracePeriod=10 Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.315710 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.320574 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hp877"] Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.328096 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.330976 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.358109 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hp877"] Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.450606 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6s7lx"] Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.453124 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6s7lx" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.457277 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.466047 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6s7lx"] Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.468669 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.468841 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.468881 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98zd4\" (UniqueName: \"kubernetes.io/projected/ae1de151-2799-49ba-839c-70e035c6f1d5-kube-api-access-98zd4\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.468931 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-config\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570012 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570067 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdsn4\" (UniqueName: \"kubernetes.io/projected/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-kube-api-access-kdsn4\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570131 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 
14:26:26.570151 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-ovs-rundir\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570183 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-ovn-rundir\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570206 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98zd4\" (UniqueName: \"kubernetes.io/projected/ae1de151-2799-49ba-839c-70e035c6f1d5-kube-api-access-98zd4\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570236 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-config\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570253 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-config\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570319 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-combined-ca-bundle\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.570342 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.571139 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.571514 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.572338 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-config\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.617505 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-98zd4\" (UniqueName: \"kubernetes.io/projected/ae1de151-2799-49ba-839c-70e035c6f1d5-kube-api-access-98zd4\") pod \"dnsmasq-dns-7fd796d7df-hp877\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " pod="openstack/dnsmasq-dns-7fd796d7df-hp877"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.651198 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hp877"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.672524 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-ovn-rundir\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.672636 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-config\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.672763 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-combined-ca-bundle\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.672902 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.672993 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdsn4\" (UniqueName: \"kubernetes.io/projected/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-kube-api-access-kdsn4\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.673107 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-ovs-rundir\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.673480 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-ovs-rundir\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.673841 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-ovn-rundir\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.675184 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-config\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.680907 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-combined-ca-bundle\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.684085 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.708108 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdsn4\" (UniqueName: \"kubernetes.io/projected/bf32834e-7ae4-4e3b-b532-dd87f6a9223e-kube-api-access-kdsn4\") pod \"ovn-controller-metrics-6s7lx\" (UID: \"bf32834e-7ae4-4e3b-b532-dd87f6a9223e\") " pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.810113 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6s7lx"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.851272 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbcz5"]
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.896898 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bpss8"]
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.899589 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.904041 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.915317 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bpss8"]
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.947157 4836 generic.go:334] "Generic (PLEG): container finished" podID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerID="d1ced8732b18e9a32bcf99bb2f034caca8afbe19ad1c6c3a49849748da69630c" exitCode=0
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.947235 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" event={"ID":"63d320ce-8669-4285-b4bc-dbb6eeb9a190","Type":"ContainerDied","Data":"d1ced8732b18e9a32bcf99bb2f034caca8afbe19ad1c6c3a49849748da69630c"}
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.986946 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.987231 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.987366 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.987517 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx4bn\" (UniqueName: \"kubernetes.io/projected/7e0a6937-945b-48fc-a328-6715e10ffddc-kube-api-access-tx4bn\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:26 crc kubenswrapper[4836]: I0217 14:26:26.988043 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-config\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.089988 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx4bn\" (UniqueName: \"kubernetes.io/projected/7e0a6937-945b-48fc-a328-6715e10ffddc-kube-api-access-tx4bn\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.090161 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-config\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.090281 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.090415 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.090443 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.092394 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.092897 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.093560 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-config\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.093598 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.111554 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx4bn\" (UniqueName: \"kubernetes.io/projected/7e0a6937-945b-48fc-a328-6715e10ffddc-kube-api-access-tx4bn\") pod \"dnsmasq-dns-86db49b7ff-bpss8\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.173269 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6s7lx"]
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.227947 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hp877"]
Feb 17 14:26:27 crc kubenswrapper[4836]: W0217 14:26:27.230710 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae1de151_2799_49ba_839c_70e035c6f1d5.slice/crio-124b726413d3b60c95f594f724c67ccaf14d05521a568a0d9ac52ffab7dc6d70 WatchSource:0}: Error finding container 124b726413d3b60c95f594f724c67ccaf14d05521a568a0d9ac52ffab7dc6d70: Status 404 returned error can't find the container with id 124b726413d3b60c95f594f724c67ccaf14d05521a568a0d9ac52ffab7dc6d70
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.237950 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:27 crc kubenswrapper[4836]: I0217 14:26:27.795264 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bpss8"]
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:27.999810 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" event={"ID":"7e0a6937-945b-48fc-a328-6715e10ffddc","Type":"ContainerStarted","Data":"ff5ecfc3d719da4b799fbc70b95c4645eaf91702ed47ae7bfdec7b990d4e151b"}
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.036189 4836 generic.go:334] "Generic (PLEG): container finished" podID="a6016745-1634-4eb6-afee-b98ce9ab8f56" containerID="aee74edc0c06a08e555878906493cce427efbca90aaeb3c4fe3a23355ef32693" exitCode=0
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.036329 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a6016745-1634-4eb6-afee-b98ce9ab8f56","Type":"ContainerDied","Data":"aee74edc0c06a08e555878906493cce427efbca90aaeb3c4fe3a23355ef32693"}
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.065584 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6s7lx" event={"ID":"bf32834e-7ae4-4e3b-b532-dd87f6a9223e","Type":"ContainerStarted","Data":"0f274f14a3714e4c0d22bf75af7b6f378d54a8f80ed9352c143e7eeb68adab24"}
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.090895 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" event={"ID":"ae1de151-2799-49ba-839c-70e035c6f1d5","Type":"ContainerStarted","Data":"124b726413d3b60c95f594f724c67ccaf14d05521a568a0d9ac52ffab7dc6d70"}
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.117753 4836 generic.go:334] "Generic (PLEG): container finished" podID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerID="1aeb38549c5093ddcbd19fe025e8df306afcc08ba355a33bcd16537686f0d989" exitCode=0
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.118244 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerDied","Data":"1aeb38549c5093ddcbd19fe025e8df306afcc08ba355a33bcd16537686f0d989"}
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.118740 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerName="dnsmasq-dns" containerID="cri-o://3ea7444a593f14512dc997fa7b4dd1c0aa61dd99e0888b276623626f9d806659" gracePeriod=10
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.523335 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.701206 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.706308 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.713578 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ksw98"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.713888 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.714034 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.714199 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.740930 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.758863 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0f031114-b776-4180-ab6e-eb5868f34d3e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.758942 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h66n\" (UniqueName: \"kubernetes.io/projected/0f031114-b776-4180-ab6e-eb5868f34d3e-kube-api-access-6h66n\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.758964 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.759023 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f031114-b776-4180-ab6e-eb5868f34d3e-config\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.759061 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.759110 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f031114-b776-4180-ab6e-eb5868f34d3e-scripts\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.759128 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.860519 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f031114-b776-4180-ab6e-eb5868f34d3e-config\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.860612 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.860664 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f031114-b776-4180-ab6e-eb5868f34d3e-scripts\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.860685 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.860717 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0f031114-b776-4180-ab6e-eb5868f34d3e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.860769 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h66n\" (UniqueName: \"kubernetes.io/projected/0f031114-b776-4180-ab6e-eb5868f34d3e-kube-api-access-6h66n\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.860791 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.867550 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0f031114-b776-4180-ab6e-eb5868f34d3e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.871699 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.872775 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.872930 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f031114-b776-4180-ab6e-eb5868f34d3e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.894769 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h66n\" (UniqueName: \"kubernetes.io/projected/0f031114-b776-4180-ab6e-eb5868f34d3e-kube-api-access-6h66n\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.898507 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f031114-b776-4180-ab6e-eb5868f34d3e-config\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:28 crc kubenswrapper[4836]: I0217 14:26:28.898527 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f031114-b776-4180-ab6e-eb5868f34d3e-scripts\") pod \"ovn-northd-0\" (UID: \"0f031114-b776-4180-ab6e-eb5868f34d3e\") " pod="openstack/ovn-northd-0"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.038438 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.137984 4836 generic.go:334] "Generic (PLEG): container finished" podID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerID="3ea7444a593f14512dc997fa7b4dd1c0aa61dd99e0888b276623626f9d806659" exitCode=0
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.138062 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" event={"ID":"e14b6d2f-85ef-4f0c-8a81-426aee02b456","Type":"ContainerDied","Data":"3ea7444a593f14512dc997fa7b4dd1c0aa61dd99e0888b276623626f9d806659"}
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.144458 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb" event={"ID":"63d320ce-8669-4285-b4bc-dbb6eeb9a190","Type":"ContainerDied","Data":"0ccc818ba3aecccefe49bbab270ac8d64079fabeee863e8305c62599ebffa6de"}
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.144553 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ccc818ba3aecccefe49bbab270ac8d64079fabeee863e8305c62599ebffa6de"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.197940 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.374144 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-config\") pod \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") "
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.374385 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-dns-svc\") pod \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") "
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.374468 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ntrv\" (UniqueName: \"kubernetes.io/projected/63d320ce-8669-4285-b4bc-dbb6eeb9a190-kube-api-access-8ntrv\") pod \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\" (UID: \"63d320ce-8669-4285-b4bc-dbb6eeb9a190\") "
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.382937 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d320ce-8669-4285-b4bc-dbb6eeb9a190-kube-api-access-8ntrv" (OuterVolumeSpecName: "kube-api-access-8ntrv") pod "63d320ce-8669-4285-b4bc-dbb6eeb9a190" (UID: "63d320ce-8669-4285-b4bc-dbb6eeb9a190"). InnerVolumeSpecName "kube-api-access-8ntrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.436281 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63d320ce-8669-4285-b4bc-dbb6eeb9a190" (UID: "63d320ce-8669-4285-b4bc-dbb6eeb9a190"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.444965 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-config" (OuterVolumeSpecName: "config") pod "63d320ce-8669-4285-b4bc-dbb6eeb9a190" (UID: "63d320ce-8669-4285-b4bc-dbb6eeb9a190"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.479498 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ntrv\" (UniqueName: \"kubernetes.io/projected/63d320ce-8669-4285-b4bc-dbb6eeb9a190-kube-api-access-8ntrv\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.479549 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.479560 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63d320ce-8669-4285-b4bc-dbb6eeb9a190-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.623574 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.623633 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.698858 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.765408 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.765505 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.777071 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.800525 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.894707 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-dns-svc\") pod \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") "
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.894767 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zckh\" (UniqueName: \"kubernetes.io/projected/e14b6d2f-85ef-4f0c-8a81-426aee02b456-kube-api-access-4zckh\") pod \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") "
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.894803 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-config\") pod \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\" (UID: \"e14b6d2f-85ef-4f0c-8a81-426aee02b456\") "
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.903519 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e14b6d2f-85ef-4f0c-8a81-426aee02b456-kube-api-access-4zckh" (OuterVolumeSpecName: "kube-api-access-4zckh") pod "e14b6d2f-85ef-4f0c-8a81-426aee02b456" (UID: "e14b6d2f-85ef-4f0c-8a81-426aee02b456"). InnerVolumeSpecName "kube-api-access-4zckh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.938790 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e14b6d2f-85ef-4f0c-8a81-426aee02b456" (UID: "e14b6d2f-85ef-4f0c-8a81-426aee02b456"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.944516 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-config" (OuterVolumeSpecName: "config") pod "e14b6d2f-85ef-4f0c-8a81-426aee02b456" (UID: "e14b6d2f-85ef-4f0c-8a81-426aee02b456"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.996826 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.996867 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zckh\" (UniqueName: \"kubernetes.io/projected/e14b6d2f-85ef-4f0c-8a81-426aee02b456-kube-api-access-4zckh\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:29 crc kubenswrapper[4836]: I0217 14:26:29.996883 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e14b6d2f-85ef-4f0c-8a81-426aee02b456-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.160227 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a6016745-1634-4eb6-afee-b98ce9ab8f56","Type":"ContainerStarted","Data":"d4d85ba381cf0b62a0a0175503f952a077c1eb634e3436f3651878883d6540f2"}
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.163353 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6s7lx" event={"ID":"bf32834e-7ae4-4e3b-b532-dd87f6a9223e","Type":"ContainerStarted","Data":"476655d8bcfbe2366d26089d77df2c41aa8693d47f6e231f7ca2793e699a4216"}
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.164843 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0f031114-b776-4180-ab6e-eb5868f34d3e","Type":"ContainerStarted","Data":"0770d15fb85402ea5503965c244932ef8b8f57b07f17801eaf1cf0d20cc68dca"}
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.166396 4836 generic.go:334] "Generic (PLEG): container finished" podID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerID="6238b015658e3e1a044d71695f4d830d8dd3bda46833739bdcd6ad73a556976d" exitCode=0
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.166472 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" event={"ID":"ae1de151-2799-49ba-839c-70e035c6f1d5","Type":"ContainerDied","Data":"6238b015658e3e1a044d71695f4d830d8dd3bda46833739bdcd6ad73a556976d"}
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.170087 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5" event={"ID":"e14b6d2f-85ef-4f0c-8a81-426aee02b456","Type":"ContainerDied","Data":"4be2faa5279826c8447da22307f09f3ad1d1675b115d7c7c5cab72070952c1fe"}
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.170144 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-jbcz5"
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.170152 4836 scope.go:117] "RemoveContainer" containerID="3ea7444a593f14512dc997fa7b4dd1c0aa61dd99e0888b276623626f9d806659"
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.173174 4836 generic.go:334] "Generic (PLEG): container finished" podID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerID="5d73acc7d3b7d21dfd57bd1f5f6891bf754918c51d20232beb9b0071a1de3710" exitCode=0
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.173267 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-s6vqb"
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.173395 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" event={"ID":"7e0a6937-945b-48fc-a328-6715e10ffddc","Type":"ContainerDied","Data":"5d73acc7d3b7d21dfd57bd1f5f6891bf754918c51d20232beb9b0071a1de3710"}
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.202067 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371985.652733 podStartE2EDuration="51.202042435s" podCreationTimestamp="2026-02-17 14:25:39 +0000 UTC" firstStartedPulling="2026-02-17 14:25:50.943527059 +0000 UTC m=+1177.286455328" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:30.191335961 +0000 UTC m=+1216.534264240" watchObservedRunningTime="2026-02-17 14:26:30.202042435 +0000 UTC m=+1216.544970704"
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.211037 4836 scope.go:117] "RemoveContainer" containerID="7bca88336a02b6b00bc19416ffdd2164736c7a5342d72427305b5d2ff3839adf"
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.304324 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6s7lx" podStartSLOduration=4.304280602 podStartE2EDuration="4.304280602s" podCreationTimestamp="2026-02-17 14:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:30.285507986 +0000 UTC m=+1216.628436255" watchObservedRunningTime="2026-02-17 14:26:30.304280602 +0000 UTC m=+1216.647208881"
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.334419 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbcz5"]
Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.352152 4836 kubelet.go:2431] "SyncLoop REMOVE"
source="api" pods=["openstack/dnsmasq-dns-666b6646f7-jbcz5"] Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.372501 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s6vqb"] Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.372704 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.391897 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-s6vqb"] Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.587814 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" path="/var/lib/kubelet/pods/63d320ce-8669-4285-b4bc-dbb6eeb9a190/volumes" Feb 17 14:26:30 crc kubenswrapper[4836]: I0217 14:26:30.588935 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" path="/var/lib/kubelet/pods/e14b6d2f-85ef-4f0c-8a81-426aee02b456/volumes" Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.138439 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.138516 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.198169 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" event={"ID":"7e0a6937-945b-48fc-a328-6715e10ffddc","Type":"ContainerStarted","Data":"a4dd5c55405656df129bfcb6d3d7edde886d28467877f4823411944db38277ef"} Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.199421 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.205470 4836 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" event={"ID":"ae1de151-2799-49ba-839c-70e035c6f1d5","Type":"ContainerStarted","Data":"9e306714806082c8f26efe22ad79196500631adf5249bfbc7b5f3c70a80f192f"} Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.206853 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.209010 4836 generic.go:334] "Generic (PLEG): container finished" podID="039a526c-4f5a-4641-9340-b18459145569" containerID="9fc719884946b23c18eb39d431c1a3a86925f7b12eb5058327ff5297c2544b72" exitCode=0 Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.209087 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"039a526c-4f5a-4641-9340-b18459145569","Type":"ContainerDied","Data":"9fc719884946b23c18eb39d431c1a3a86925f7b12eb5058327ff5297c2544b72"} Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.222252 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" podStartSLOduration=5.222229577 podStartE2EDuration="5.222229577s" podCreationTimestamp="2026-02-17 14:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:31.220080559 +0000 UTC m=+1217.563008828" watchObservedRunningTime="2026-02-17 14:26:31.222229577 +0000 UTC m=+1217.565157846" Feb 17 14:26:31 crc kubenswrapper[4836]: I0217 14:26:31.288651 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" podStartSLOduration=5.288626045 podStartE2EDuration="5.288626045s" podCreationTimestamp="2026-02-17 14:26:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:31.2748636 
+0000 UTC m=+1217.617791889" watchObservedRunningTime="2026-02-17 14:26:31.288626045 +0000 UTC m=+1217.631554314" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.227606 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0f031114-b776-4180-ab6e-eb5868f34d3e","Type":"ContainerStarted","Data":"5306b2ee0e0512bed9154941beda2bb67a18a17518b1967232d0c4bb2e53b785"} Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.227971 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0f031114-b776-4180-ab6e-eb5868f34d3e","Type":"ContainerStarted","Data":"1655c3da47c805f9ddc21bc36e579de15f15d66e96becd5ec0544bc750bfe3ed"} Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.228090 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.260077 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.958174922 podStartE2EDuration="4.260057966s" podCreationTimestamp="2026-02-17 14:26:28 +0000 UTC" firstStartedPulling="2026-02-17 14:26:29.703217592 +0000 UTC m=+1216.046145861" lastFinishedPulling="2026-02-17 14:26:31.005100636 +0000 UTC m=+1217.348028905" observedRunningTime="2026-02-17 14:26:32.252127715 +0000 UTC m=+1218.595055984" watchObservedRunningTime="2026-02-17 14:26:32.260057966 +0000 UTC m=+1218.602986235" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.385915 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d8f3-account-create-update-kmlvm"] Feb 17 14:26:32 crc kubenswrapper[4836]: E0217 14:26:32.386282 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerName="dnsmasq-dns" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.386309 4836 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerName="dnsmasq-dns" Feb 17 14:26:32 crc kubenswrapper[4836]: E0217 14:26:32.386331 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerName="init" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.386338 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerName="init" Feb 17 14:26:32 crc kubenswrapper[4836]: E0217 14:26:32.386352 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerName="init" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.386358 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerName="init" Feb 17 14:26:32 crc kubenswrapper[4836]: E0217 14:26:32.386375 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerName="dnsmasq-dns" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.386380 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerName="dnsmasq-dns" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.386560 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="e14b6d2f-85ef-4f0c-8a81-426aee02b456" containerName="dnsmasq-dns" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.386582 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d320ce-8669-4285-b4bc-dbb6eeb9a190" containerName="dnsmasq-dns" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.387506 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.389792 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.425241 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d8f3-account-create-update-kmlvm"] Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.456053 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-k7zc9"] Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.458379 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.484619 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1659d-7892-4744-a570-4ba7c65e4caf-operator-scripts\") pod \"keystone-d8f3-account-create-update-kmlvm\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.484802 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9ljc\" (UniqueName: \"kubernetes.io/projected/2ae1659d-7892-4744-a570-4ba7c65e4caf-kube-api-access-g9ljc\") pod \"keystone-d8f3-account-create-update-kmlvm\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.485602 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k7zc9"] Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.587025 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stl2m\" 
(UniqueName: \"kubernetes.io/projected/e562d506-21d2-4edd-90b8-97bd11bf068e-kube-api-access-stl2m\") pod \"keystone-db-create-k7zc9\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.587109 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1659d-7892-4744-a570-4ba7c65e4caf-operator-scripts\") pod \"keystone-d8f3-account-create-update-kmlvm\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.587162 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e562d506-21d2-4edd-90b8-97bd11bf068e-operator-scripts\") pod \"keystone-db-create-k7zc9\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.587225 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9ljc\" (UniqueName: \"kubernetes.io/projected/2ae1659d-7892-4744-a570-4ba7c65e4caf-kube-api-access-g9ljc\") pod \"keystone-d8f3-account-create-update-kmlvm\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.588348 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1659d-7892-4744-a570-4ba7c65e4caf-operator-scripts\") pod \"keystone-d8f3-account-create-update-kmlvm\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.619578 4836 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g9ljc\" (UniqueName: \"kubernetes.io/projected/2ae1659d-7892-4744-a570-4ba7c65e4caf-kube-api-access-g9ljc\") pod \"keystone-d8f3-account-create-update-kmlvm\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.648360 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hx7tv"] Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.651007 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.665195 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hx7tv"] Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.696654 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e562d506-21d2-4edd-90b8-97bd11bf068e-operator-scripts\") pod \"keystone-db-create-k7zc9\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.697280 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stl2m\" (UniqueName: \"kubernetes.io/projected/e562d506-21d2-4edd-90b8-97bd11bf068e-kube-api-access-stl2m\") pod \"keystone-db-create-k7zc9\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.697978 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e562d506-21d2-4edd-90b8-97bd11bf068e-operator-scripts\") pod \"keystone-db-create-k7zc9\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 
crc kubenswrapper[4836]: I0217 14:26:32.722603 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.722826 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stl2m\" (UniqueName: \"kubernetes.io/projected/e562d506-21d2-4edd-90b8-97bd11bf068e-kube-api-access-stl2m\") pod \"keystone-db-create-k7zc9\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.778387 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-83de-account-create-update-fh75b"] Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.780244 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.784626 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.786495 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.808619 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-operator-scripts\") pod \"placement-db-create-hx7tv\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") " pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.808900 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfzfp\" (UniqueName: \"kubernetes.io/projected/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-kube-api-access-sfzfp\") pod \"placement-db-create-hx7tv\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") " pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.824101 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-83de-account-create-update-fh75b"] Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.914437 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54905e17-d443-4465-8f70-7be04a89086f-operator-scripts\") pod \"placement-83de-account-create-update-fh75b\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.915008 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-operator-scripts\") pod \"placement-db-create-hx7tv\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") " pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.915113 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8nc5\" (UniqueName: \"kubernetes.io/projected/54905e17-d443-4465-8f70-7be04a89086f-kube-api-access-r8nc5\") pod \"placement-83de-account-create-update-fh75b\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.915176 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfzfp\" (UniqueName: \"kubernetes.io/projected/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-kube-api-access-sfzfp\") pod \"placement-db-create-hx7tv\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") " pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.916904 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-operator-scripts\") pod \"placement-db-create-hx7tv\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") " pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:32 crc kubenswrapper[4836]: I0217 14:26:32.940993 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfzfp\" (UniqueName: \"kubernetes.io/projected/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-kube-api-access-sfzfp\") pod \"placement-db-create-hx7tv\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") " pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.007820 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-hx7tv" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.020347 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54905e17-d443-4465-8f70-7be04a89086f-operator-scripts\") pod \"placement-83de-account-create-update-fh75b\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.021098 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8nc5\" (UniqueName: \"kubernetes.io/projected/54905e17-d443-4465-8f70-7be04a89086f-kube-api-access-r8nc5\") pod \"placement-83de-account-create-update-fh75b\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.021113 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54905e17-d443-4465-8f70-7be04a89086f-operator-scripts\") pod \"placement-83de-account-create-update-fh75b\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.046260 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8nc5\" (UniqueName: \"kubernetes.io/projected/54905e17-d443-4465-8f70-7be04a89086f-kube-api-access-r8nc5\") pod \"placement-83de-account-create-update-fh75b\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.229887 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.324831 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d8f3-account-create-update-kmlvm"] Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.470776 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k7zc9"] Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.574362 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hx7tv"] Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.881451 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 14:26:33 crc kubenswrapper[4836]: I0217 14:26:33.894750 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-83de-account-create-update-fh75b"] Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.266509 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d8f3-account-create-update-kmlvm" event={"ID":"2ae1659d-7892-4744-a570-4ba7c65e4caf","Type":"ContainerStarted","Data":"053f72cbdf2c7e5f30db435af69e5bbe1df08b8271492028d870e534720e3fc6"} Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.344405 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hp877"] Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.350869 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerName="dnsmasq-dns" containerID="cri-o://9e306714806082c8f26efe22ad79196500631adf5249bfbc7b5f3c70a80f192f" gracePeriod=10 Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.389065 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-wbh2w"] Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 
14:26:34.391735 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.403998 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wbh2w"] Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.471873 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-dns-svc\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.471968 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-config\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.472218 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.472265 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76vb5\" (UniqueName: \"kubernetes.io/projected/312259c2-4f8f-401d-a19e-64d0bc7dd35f-kube-api-access-76vb5\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.472319 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.585636 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-dns-svc\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.586178 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-config\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.586420 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.586481 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76vb5\" (UniqueName: \"kubernetes.io/projected/312259c2-4f8f-401d-a19e-64d0bc7dd35f-kube-api-access-76vb5\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.586522 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.587286 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-dns-svc\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.587752 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.589058 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.589287 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-config\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.618084 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76vb5\" (UniqueName: \"kubernetes.io/projected/312259c2-4f8f-401d-a19e-64d0bc7dd35f-kube-api-access-76vb5\") pod \"dnsmasq-dns-698758b865-wbh2w\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:34 crc kubenswrapper[4836]: I0217 14:26:34.745490 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.294597 4836 generic.go:334] "Generic (PLEG): container finished" podID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerID="9e306714806082c8f26efe22ad79196500631adf5249bfbc7b5f3c70a80f192f" exitCode=0
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.294673 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" event={"ID":"ae1de151-2799-49ba-839c-70e035c6f1d5","Type":"ContainerDied","Data":"9e306714806082c8f26efe22ad79196500631adf5249bfbc7b5f3c70a80f192f"}
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.514358 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.521684 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.525361 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.525365 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.525515 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.527729 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-g6scn"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.549555 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.616450 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqdzq\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-kube-api-access-pqdzq\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.616618 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.616675 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e482046c-502a-4f41-b013-7b3ef1c71ee1-cache\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.616750 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e482046c-502a-4f41-b013-7b3ef1c71ee1-lock\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.616815 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.616874 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e482046c-502a-4f41-b013-7b3ef1c71ee1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.719917 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.720014 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e482046c-502a-4f41-b013-7b3ef1c71ee1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.720130 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqdzq\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-kube-api-access-pqdzq\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.720216 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.720269 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e482046c-502a-4f41-b013-7b3ef1c71ee1-cache\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: E0217 14:26:35.720331 4836 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:26:35 crc kubenswrapper[4836]: E0217 14:26:35.720372 4836 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:26:35 crc kubenswrapper[4836]: E0217 14:26:35.720454 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift podName:e482046c-502a-4f41-b013-7b3ef1c71ee1 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:36.220423801 +0000 UTC m=+1222.563352140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift") pod "swift-storage-0" (UID: "e482046c-502a-4f41-b013-7b3ef1c71ee1") : configmap "swift-ring-files" not found
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.720350 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e482046c-502a-4f41-b013-7b3ef1c71ee1-lock\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.720954 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/e482046c-502a-4f41-b013-7b3ef1c71ee1-cache\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.721520 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/e482046c-502a-4f41-b013-7b3ef1c71ee1-lock\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.725072 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.725122 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3fb15f4e3277f1f113896c526bb3ebf7a54f83f6fad85785ce0d01aa07563fdc/globalmount\"" pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.734289 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e482046c-502a-4f41-b013-7b3ef1c71ee1-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.746277 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqdzq\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-kube-api-access-pqdzq\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:35 crc kubenswrapper[4836]: I0217 14:26:35.775895 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0d825549-7bd9-4e47-a4b1-bd74526d0dee\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.233551 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:36 crc kubenswrapper[4836]: E0217 14:26:36.234193 4836 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:26:36 crc kubenswrapper[4836]: E0217 14:26:36.234212 4836 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:26:36 crc kubenswrapper[4836]: E0217 14:26:36.234269 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift podName:e482046c-502a-4f41-b013-7b3ef1c71ee1 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:37.234254191 +0000 UTC m=+1223.577182450 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift") pod "swift-storage-0" (UID: "e482046c-502a-4f41-b013-7b3ef1c71ee1") : configmap "swift-ring-files" not found
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.560776 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dbzmx"]
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.562048 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.563812 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.564559 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.568669 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.599853 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dbzmx"]
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.641060 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb33695b-c451-44b2-8a2a-fe534a4040e3-etc-swift\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.641175 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-scripts\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.641219 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-combined-ca-bundle\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.641310 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-dispersionconf\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.641411 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-ring-data-devices\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.641454 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-swiftconf\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.641542 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlxn6\" (UniqueName: \"kubernetes.io/projected/cb33695b-c451-44b2-8a2a-fe534a4040e3-kube-api-access-tlxn6\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.656611 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pn587"]
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.658290 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pn587"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.675024 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pn587"]
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.687846 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d162-account-create-update-khb5j"]
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.704020 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.708897 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.733883 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d162-account-create-update-khb5j"]
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.750868 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb33695b-c451-44b2-8a2a-fe534a4040e3-etc-swift\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751040 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-scripts\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751077 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96b57\" (UniqueName: \"kubernetes.io/projected/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-kube-api-access-96b57\") pod \"glance-db-create-pn587\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") " pod="openstack/glance-db-create-pn587"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751117 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-combined-ca-bundle\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751200 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-dispersionconf\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751256 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-operator-scripts\") pod \"glance-db-create-pn587\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") " pod="openstack/glance-db-create-pn587"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751348 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-ring-data-devices\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751412 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-swiftconf\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751448 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sfbb\" (UniqueName: \"kubernetes.io/projected/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-kube-api-access-8sfbb\") pod \"glance-d162-account-create-update-khb5j\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751558 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlxn6\" (UniqueName: \"kubernetes.io/projected/cb33695b-c451-44b2-8a2a-fe534a4040e3-kube-api-access-tlxn6\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.751641 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-operator-scripts\") pod \"glance-d162-account-create-update-khb5j\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.753072 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb33695b-c451-44b2-8a2a-fe534a4040e3-etc-swift\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.753807 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-ring-data-devices\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.756201 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-scripts\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.761553 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-dispersionconf\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.764159 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-combined-ca-bundle\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.774325 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-swiftconf\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.780204 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlxn6\" (UniqueName: \"kubernetes.io/projected/cb33695b-c451-44b2-8a2a-fe534a4040e3-kube-api-access-tlxn6\") pod \"swift-ring-rebalance-dbzmx\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.853218 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sfbb\" (UniqueName: \"kubernetes.io/projected/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-kube-api-access-8sfbb\") pod \"glance-d162-account-create-update-khb5j\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.853589 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-operator-scripts\") pod \"glance-d162-account-create-update-khb5j\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.853806 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96b57\" (UniqueName: \"kubernetes.io/projected/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-kube-api-access-96b57\") pod \"glance-db-create-pn587\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") " pod="openstack/glance-db-create-pn587"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.853954 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-operator-scripts\") pod \"glance-db-create-pn587\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") " pod="openstack/glance-db-create-pn587"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.854551 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-operator-scripts\") pod \"glance-d162-account-create-update-khb5j\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.854763 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-operator-scripts\") pod \"glance-db-create-pn587\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") " pod="openstack/glance-db-create-pn587"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.873913 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sfbb\" (UniqueName: \"kubernetes.io/projected/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-kube-api-access-8sfbb\") pod \"glance-d162-account-create-update-khb5j\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.879535 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96b57\" (UniqueName: \"kubernetes.io/projected/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-kube-api-access-96b57\") pod \"glance-db-create-pn587\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") " pod="openstack/glance-db-create-pn587"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.893132 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dbzmx"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.954213 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-r4gdh"
Feb 17 14:26:36 crc kubenswrapper[4836]: I0217 14:26:36.989113 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pn587"
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.050772 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d162-account-create-update-khb5j"
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.087087 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j"
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.185244 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-fsq2h"
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.241034 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8"
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.261390 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:37 crc kubenswrapper[4836]: E0217 14:26:37.261755 4836 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:26:37 crc kubenswrapper[4836]: E0217 14:26:37.261785 4836 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:26:37 crc kubenswrapper[4836]: E0217 14:26:37.261854 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift podName:e482046c-502a-4f41-b013-7b3ef1c71ee1 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:39.261833318 +0000 UTC m=+1225.604761587 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift") pod "swift-storage-0" (UID: "e482046c-502a-4f41-b013-7b3ef1c71ee1") : configmap "swift-ring-files" not found
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.465909 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.588694 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 17 14:26:37 crc kubenswrapper[4836]: I0217 14:26:37.828338 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="1c33fb01-9bf7-43f1-86d5-004e70d3721c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.002249 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.172641 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.177121 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-c8vxs"]
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.179479 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.184154 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.187941 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8vxs"]
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.284352 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f86lz\" (UniqueName: \"kubernetes.io/projected/21d02a34-d68b-4cae-9f03-0b15d07fe948-kube-api-access-f86lz\") pod \"root-account-create-update-c8vxs\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.284507 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d02a34-d68b-4cae-9f03-0b15d07fe948-operator-scripts\") pod \"root-account-create-update-c8vxs\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.385982 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f86lz\" (UniqueName: \"kubernetes.io/projected/21d02a34-d68b-4cae-9f03-0b15d07fe948-kube-api-access-f86lz\") pod \"root-account-create-update-c8vxs\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.386138 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d02a34-d68b-4cae-9f03-0b15d07fe948-operator-scripts\") pod \"root-account-create-update-c8vxs\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.387853 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d02a34-d68b-4cae-9f03-0b15d07fe948-operator-scripts\") pod \"root-account-create-update-c8vxs\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.414012 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f86lz\" (UniqueName: \"kubernetes.io/projected/21d02a34-d68b-4cae-9f03-0b15d07fe948-kube-api-access-f86lz\") pod \"root-account-create-update-c8vxs\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:38 crc kubenswrapper[4836]: I0217 14:26:38.543623 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8vxs"
Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.309707 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:39 crc kubenswrapper[4836]: E0217 14:26:39.310387 4836 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:26:39 crc kubenswrapper[4836]: E0217 14:26:39.310411 4836 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:26:39 crc kubenswrapper[4836]: E0217 14:26:39.310471 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift podName:e482046c-502a-4f41-b013-7b3ef1c71ee1 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:43.310452521 +0000 UTC m=+1229.653380790 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift") pod "swift-storage-0" (UID: "e482046c-502a-4f41-b013-7b3ef1c71ee1") : configmap "swift-ring-files" not found
Feb 17 14:26:39 crc kubenswrapper[4836]: W0217 14:26:39.431082 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54905e17_d443_4465_8f70_7be04a89086f.slice/crio-944bbce44653f332757b41aa709ff915e239c63242463d545f1454d580e8215e WatchSource:0}: Error finding container 944bbce44653f332757b41aa709ff915e239c63242463d545f1454d580e8215e: Status 404 returned error can't find the container with id 944bbce44653f332757b41aa709ff915e239c63242463d545f1454d580e8215e
Feb 17 14:26:39 crc kubenswrapper[4836]: W0217 14:26:39.434869 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode562d506_21d2_4edd_90b8_97bd11bf068e.slice/crio-7bdbd8741734a15b799a8b486b726df845fdab465fc49bc23c59d7482c73e7d5 WatchSource:0}: Error finding container 7bdbd8741734a15b799a8b486b726df845fdab465fc49bc23c59d7482c73e7d5: Status 404 returned error can't find the container with id 7bdbd8741734a15b799a8b486b726df845fdab465fc49bc23c59d7482c73e7d5
Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.671603 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.729247 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-dns-svc\") pod \"ae1de151-2799-49ba-839c-70e035c6f1d5\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.731854 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98zd4\" (UniqueName: \"kubernetes.io/projected/ae1de151-2799-49ba-839c-70e035c6f1d5-kube-api-access-98zd4\") pod \"ae1de151-2799-49ba-839c-70e035c6f1d5\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.731960 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-config\") pod \"ae1de151-2799-49ba-839c-70e035c6f1d5\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.732272 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-ovsdbserver-nb\") pod \"ae1de151-2799-49ba-839c-70e035c6f1d5\" (UID: \"ae1de151-2799-49ba-839c-70e035c6f1d5\") " Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.736640 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae1de151-2799-49ba-839c-70e035c6f1d5-kube-api-access-98zd4" (OuterVolumeSpecName: "kube-api-access-98zd4") pod "ae1de151-2799-49ba-839c-70e035c6f1d5" (UID: "ae1de151-2799-49ba-839c-70e035c6f1d5"). InnerVolumeSpecName "kube-api-access-98zd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.803973 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ae1de151-2799-49ba-839c-70e035c6f1d5" (UID: "ae1de151-2799-49ba-839c-70e035c6f1d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.811534 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ae1de151-2799-49ba-839c-70e035c6f1d5" (UID: "ae1de151-2799-49ba-839c-70e035c6f1d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.820136 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-config" (OuterVolumeSpecName: "config") pod "ae1de151-2799-49ba-839c-70e035c6f1d5" (UID: "ae1de151-2799-49ba-839c-70e035c6f1d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.834997 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.835035 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.835046 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98zd4\" (UniqueName: \"kubernetes.io/projected/ae1de151-2799-49ba-839c-70e035c6f1d5-kube-api-access-98zd4\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:39 crc kubenswrapper[4836]: I0217 14:26:39.835056 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1de151-2799-49ba-839c-70e035c6f1d5-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.204190 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pn587"] Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.394628 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hx7tv" event={"ID":"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b","Type":"ContainerStarted","Data":"8f88022ab4daa99006c48416f95fa6fcf0ec231af3f8553f0fffe8cc8f1971ee"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.395083 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hx7tv" event={"ID":"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b","Type":"ContainerStarted","Data":"f20e9dc32e437145f802a537fa2665afc2ae79a191175483ec92ca0e4108918e"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.420003 4836 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-83de-account-create-update-fh75b" event={"ID":"54905e17-d443-4465-8f70-7be04a89086f","Type":"ContainerStarted","Data":"bf410eadcd21b6c409b08a23916bc0ac4d5ba43505387a89c251ab098b87e562"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.420056 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83de-account-create-update-fh75b" event={"ID":"54905e17-d443-4465-8f70-7be04a89086f","Type":"ContainerStarted","Data":"944bbce44653f332757b41aa709ff915e239c63242463d545f1454d580e8215e"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.443340 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d162-account-create-update-khb5j"] Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.444795 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k7zc9" event={"ID":"e562d506-21d2-4edd-90b8-97bd11bf068e","Type":"ContainerStarted","Data":"c0e6439979838c98e66157164ef8073f70f7245c52bc8c72b4753a2777fab786"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.444909 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k7zc9" event={"ID":"e562d506-21d2-4edd-90b8-97bd11bf068e","Type":"ContainerStarted","Data":"7bdbd8741734a15b799a8b486b726df845fdab465fc49bc23c59d7482c73e7d5"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.462322 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-hx7tv" podStartSLOduration=8.462284608 podStartE2EDuration="8.462284608s" podCreationTimestamp="2026-02-17 14:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:40.419772583 +0000 UTC m=+1226.762700852" watchObservedRunningTime="2026-02-17 14:26:40.462284608 +0000 UTC m=+1226.805212867" Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.495984 4836 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.496401 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" event={"ID":"ae1de151-2799-49ba-839c-70e035c6f1d5","Type":"ContainerDied","Data":"124b726413d3b60c95f594f724c67ccaf14d05521a568a0d9ac52ffab7dc6d70"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.496569 4836 scope.go:117] "RemoveContainer" containerID="9e306714806082c8f26efe22ad79196500631adf5249bfbc7b5f3c70a80f192f" Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.500822 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-83de-account-create-update-fh75b" podStartSLOduration=8.500793959 podStartE2EDuration="8.500793959s" podCreationTimestamp="2026-02-17 14:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:40.449756727 +0000 UTC m=+1226.792684996" watchObservedRunningTime="2026-02-17 14:26:40.500793959 +0000 UTC m=+1226.843722228" Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.512737 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-k7zc9" podStartSLOduration=8.512710094 podStartE2EDuration="8.512710094s" podCreationTimestamp="2026-02-17 14:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:40.495879138 +0000 UTC m=+1226.838807407" watchObservedRunningTime="2026-02-17 14:26:40.512710094 +0000 UTC m=+1226.855638363" Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.550618 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d8f3-account-create-update-kmlvm" 
event={"ID":"2ae1659d-7892-4744-a570-4ba7c65e4caf","Type":"ContainerStarted","Data":"bda2c6a640050c54150d82f44c6e78a2f7107b79ee0b4f6fd03e4d8c6e1019d3"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.643889 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d8f3-account-create-update-kmlvm" podStartSLOduration=8.643867268 podStartE2EDuration="8.643867268s" podCreationTimestamp="2026-02-17 14:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:40.601695372 +0000 UTC m=+1226.944623641" watchObservedRunningTime="2026-02-17 14:26:40.643867268 +0000 UTC m=+1226.986795537" Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.694921 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dbzmx"] Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.694977 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"039a526c-4f5a-4641-9340-b18459145569","Type":"ContainerStarted","Data":"03a6dc27159e9ff8f5c2d8c4e46d28e3fd0b9d571e2b143e652c2a068b3a073e"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.695004 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerStarted","Data":"5de26698cc194f27aa6fa46281e03b3fa0bc2faa6bf0ef9b745f3fec33e05835"} Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.695026 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hp877"] Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.699980 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pn587" event={"ID":"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b","Type":"ContainerStarted","Data":"337993452ec5730e0c3cc016c9ba757fc9a43025082efe848b2e9b3eeab12528"} 
Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.716824 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-hp877"]
Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.721763 4836 scope.go:117] "RemoveContainer" containerID="6238b015658e3e1a044d71695f4d830d8dd3bda46833739bdcd6ad73a556976d"
Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.932587 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wbh2w"]
Feb 17 14:26:40 crc kubenswrapper[4836]: I0217 14:26:40.952729 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-c8vxs"]
Feb 17 14:26:40 crc kubenswrapper[4836]: W0217 14:26:40.953506 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21d02a34_d68b_4cae_9f03_0b15d07fe948.slice/crio-f11b06be301a9f9b682d6007240d8ae8a505a1f6da93c11fc5e55f0036da5ae0 WatchSource:0}: Error finding container f11b06be301a9f9b682d6007240d8ae8a505a1f6da93c11fc5e55f0036da5ae0: Status 404 returned error can't find the container with id f11b06be301a9f9b682d6007240d8ae8a505a1f6da93c11fc5e55f0036da5ae0
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.652400 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-hp877" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.127:5353: i/o timeout"
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.724111 4836 generic.go:334] "Generic (PLEG): container finished" podID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerID="44931b40ada4bc7bee4acb5d1054d14507951ed9df360a9eb97ae5e6b0efb503" exitCode=0
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.724211 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wbh2w" event={"ID":"312259c2-4f8f-401d-a19e-64d0bc7dd35f","Type":"ContainerDied","Data":"44931b40ada4bc7bee4acb5d1054d14507951ed9df360a9eb97ae5e6b0efb503"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.724247 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wbh2w" event={"ID":"312259c2-4f8f-401d-a19e-64d0bc7dd35f","Type":"ContainerStarted","Data":"c6f4101d16fd86bcceb0625244616ff16d1c5665adecebcc6d46b7d7f983a200"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.729959 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d162-account-create-update-khb5j" event={"ID":"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5","Type":"ContainerStarted","Data":"55c6c8d1d911f68476c5d07d35dec7d57e500cdc1c29d64681255555160897dd"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.730042 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d162-account-create-update-khb5j" event={"ID":"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5","Type":"ContainerStarted","Data":"4488137d41693a0eed0cf3344bd79369971085583a9f2a449f30914ec350a79a"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.737073 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pn587" event={"ID":"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b","Type":"ContainerStarted","Data":"0179fb4c7564ecef52fa63a2f91fe687b3340cb3f7aaa46ff46f4ec68e5ee26d"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.762200 4836 generic.go:334] "Generic (PLEG): container finished" podID="add50d48-0a1c-4d2f-bcc3-ae9355e95c3b" containerID="8f88022ab4daa99006c48416f95fa6fcf0ec231af3f8553f0fffe8cc8f1971ee" exitCode=0
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.762328 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hx7tv" event={"ID":"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b","Type":"ContainerDied","Data":"8f88022ab4daa99006c48416f95fa6fcf0ec231af3f8553f0fffe8cc8f1971ee"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.764627 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dbzmx" event={"ID":"cb33695b-c451-44b2-8a2a-fe534a4040e3","Type":"ContainerStarted","Data":"0863004180b5c7074ba22f1ddb8c58005ebe6a0d2ac8583efc764697e8242881"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.766244 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8vxs" event={"ID":"21d02a34-d68b-4cae-9f03-0b15d07fe948","Type":"ContainerStarted","Data":"4c54331d8c22a82e7135a4bdfa56b01c1bacccea5967146f9a8bb1c17d9ca3da"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.766284 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8vxs" event={"ID":"21d02a34-d68b-4cae-9f03-0b15d07fe948","Type":"ContainerStarted","Data":"f11b06be301a9f9b682d6007240d8ae8a505a1f6da93c11fc5e55f0036da5ae0"}
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.789019 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-pn587" podStartSLOduration=5.788988959 podStartE2EDuration="5.788988959s" podCreationTimestamp="2026-02-17 14:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:41.769254257 +0000 UTC m=+1228.112182526" watchObservedRunningTime="2026-02-17 14:26:41.788988959 +0000 UTC m=+1228.131917238"
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.813827 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-d162-account-create-update-khb5j" podStartSLOduration=5.813790666 podStartE2EDuration="5.813790666s" podCreationTimestamp="2026-02-17 14:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:41.785885917 +0000 UTC m=+1228.128814186" watchObservedRunningTime="2026-02-17 14:26:41.813790666 +0000 UTC m=+1228.156718935"
Feb 17 14:26:41 crc kubenswrapper[4836]: I0217 14:26:41.922845 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-c8vxs" podStartSLOduration=3.922817854 podStartE2EDuration="3.922817854s" podCreationTimestamp="2026-02-17 14:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:41.816729674 +0000 UTC m=+1228.159657943" watchObservedRunningTime="2026-02-17 14:26:41.922817854 +0000 UTC m=+1228.265746123"
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.583754 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" path="/var/lib/kubelet/pods/ae1de151-2799-49ba-839c-70e035c6f1d5/volumes"
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.782019 4836 generic.go:334] "Generic (PLEG): container finished" podID="77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b" containerID="0179fb4c7564ecef52fa63a2f91fe687b3340cb3f7aaa46ff46f4ec68e5ee26d" exitCode=0
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.782149 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pn587" event={"ID":"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b","Type":"ContainerDied","Data":"0179fb4c7564ecef52fa63a2f91fe687b3340cb3f7aaa46ff46f4ec68e5ee26d"}
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.785198 4836 generic.go:334] "Generic (PLEG): container finished" podID="e562d506-21d2-4edd-90b8-97bd11bf068e" containerID="c0e6439979838c98e66157164ef8073f70f7245c52bc8c72b4753a2777fab786" exitCode=0
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.785280 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k7zc9" event={"ID":"e562d506-21d2-4edd-90b8-97bd11bf068e","Type":"ContainerDied","Data":"c0e6439979838c98e66157164ef8073f70f7245c52bc8c72b4753a2777fab786"}
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.790037 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wbh2w" event={"ID":"312259c2-4f8f-401d-a19e-64d0bc7dd35f","Type":"ContainerStarted","Data":"de75bc86bd0570fcef07a3f3195cfec352721b59eef66e22b061ebca87ca6456"}
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.790165 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-wbh2w"
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.792092 4836 generic.go:334] "Generic (PLEG): container finished" podID="ec9408e6-0474-4f84-842e-b1c20f42a7b8" containerID="1e0077eb33d7cdccabd3d53eadba26bb33ef9899ccdc0c0e3003d7b300233249" exitCode=0
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.792173 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ec9408e6-0474-4f84-842e-b1c20f42a7b8","Type":"ContainerDied","Data":"1e0077eb33d7cdccabd3d53eadba26bb33ef9899ccdc0c0e3003d7b300233249"}
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.804984 4836 generic.go:334] "Generic (PLEG): container finished" podID="6f866bb7-5209-4275-8884-df6f074b3f7c" containerID="85576fe15acb4ec82e880a96b65a7ac8f381e29f3114bed6ed63c37985fe03f0" exitCode=0
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.805166 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f866bb7-5209-4275-8884-df6f074b3f7c","Type":"ContainerDied","Data":"85576fe15acb4ec82e880a96b65a7ac8f381e29f3114bed6ed63c37985fe03f0"}
Feb 17 14:26:42 crc kubenswrapper[4836]: I0217 14:26:42.884848 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-wbh2w" podStartSLOduration=8.884823284 podStartE2EDuration="8.884823284s" podCreationTimestamp="2026-02-17 14:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:42.8722132 +0000 UTC m=+1229.215141479" watchObservedRunningTime="2026-02-17 14:26:42.884823284 +0000 UTC m=+1229.227751553"
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.342907 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0"
Feb 17 14:26:43 crc kubenswrapper[4836]: E0217 14:26:43.343163 4836 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 17 14:26:43 crc kubenswrapper[4836]: E0217 14:26:43.343195 4836 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 17 14:26:43 crc kubenswrapper[4836]: E0217 14:26:43.343274 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift podName:e482046c-502a-4f41-b013-7b3ef1c71ee1 nodeName:}" failed. No retries permitted until 2026-02-17 14:26:51.343248737 +0000 UTC m=+1237.686177006 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift") pod "swift-storage-0" (UID: "e482046c-502a-4f41-b013-7b3ef1c71ee1") : configmap "swift-ring-files" not found
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.818652 4836 generic.go:334] "Generic (PLEG): container finished" podID="2ae1659d-7892-4744-a570-4ba7c65e4caf" containerID="bda2c6a640050c54150d82f44c6e78a2f7107b79ee0b4f6fd03e4d8c6e1019d3" exitCode=0
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.818748 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d8f3-account-create-update-kmlvm" event={"ID":"2ae1659d-7892-4744-a570-4ba7c65e4caf","Type":"ContainerDied","Data":"bda2c6a640050c54150d82f44c6e78a2f7107b79ee0b4f6fd03e4d8c6e1019d3"}
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.820816 4836 generic.go:334] "Generic (PLEG): container finished" podID="1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5" containerID="55c6c8d1d911f68476c5d07d35dec7d57e500cdc1c29d64681255555160897dd" exitCode=0
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.820881 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d162-account-create-update-khb5j" event={"ID":"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5","Type":"ContainerDied","Data":"55c6c8d1d911f68476c5d07d35dec7d57e500cdc1c29d64681255555160897dd"}
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.823037 4836 generic.go:334] "Generic (PLEG): container finished" podID="54905e17-d443-4465-8f70-7be04a89086f" containerID="bf410eadcd21b6c409b08a23916bc0ac4d5ba43505387a89c251ab098b87e562" exitCode=0
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.823115 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83de-account-create-update-fh75b" event={"ID":"54905e17-d443-4465-8f70-7be04a89086f","Type":"ContainerDied","Data":"bf410eadcd21b6c409b08a23916bc0ac4d5ba43505387a89c251ab098b87e562"}
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.825429 4836 generic.go:334] "Generic (PLEG): container finished" podID="21d02a34-d68b-4cae-9f03-0b15d07fe948" containerID="4c54331d8c22a82e7135a4bdfa56b01c1bacccea5967146f9a8bb1c17d9ca3da" exitCode=0
Feb 17 14:26:43 crc kubenswrapper[4836]: I0217 14:26:43.825630 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8vxs" event={"ID":"21d02a34-d68b-4cae-9f03-0b15d07fe948","Type":"ContainerDied","Data":"4c54331d8c22a82e7135a4bdfa56b01c1bacccea5967146f9a8bb1c17d9ca3da"}
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.399348 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pn587"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.405603 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hx7tv"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.583779 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96b57\" (UniqueName: \"kubernetes.io/projected/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-kube-api-access-96b57\") pod \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") "
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.583835 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-operator-scripts\") pod \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\" (UID: \"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b\") "
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.583996 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-operator-scripts\") pod \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") "
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.584071 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfzfp\" (UniqueName: \"kubernetes.io/projected/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-kube-api-access-sfzfp\") pod \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\" (UID: \"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b\") "
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.585035 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "add50d48-0a1c-4d2f-bcc3-ae9355e95c3b" (UID: "add50d48-0a1c-4d2f-bcc3-ae9355e95c3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.585035 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b" (UID: "77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.592917 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-kube-api-access-96b57" (OuterVolumeSpecName: "kube-api-access-96b57") pod "77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b" (UID: "77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b"). InnerVolumeSpecName "kube-api-access-96b57". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.612976 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-kube-api-access-sfzfp" (OuterVolumeSpecName: "kube-api-access-sfzfp") pod "add50d48-0a1c-4d2f-bcc3-ae9355e95c3b" (UID: "add50d48-0a1c-4d2f-bcc3-ae9355e95c3b"). InnerVolumeSpecName "kube-api-access-sfzfp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.686019 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96b57\" (UniqueName: \"kubernetes.io/projected/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-kube-api-access-96b57\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.686047 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.686056 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.686065 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfzfp\" (UniqueName: \"kubernetes.io/projected/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b-kube-api-access-sfzfp\") on node \"crc\" DevicePath \"\""
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.835470 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pn587" event={"ID":"77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b","Type":"ContainerDied","Data":"337993452ec5730e0c3cc016c9ba757fc9a43025082efe848b2e9b3eeab12528"}
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.835536 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="337993452ec5730e0c3cc016c9ba757fc9a43025082efe848b2e9b3eeab12528"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.835496 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pn587"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.836666 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hx7tv" event={"ID":"add50d48-0a1c-4d2f-bcc3-ae9355e95c3b","Type":"ContainerDied","Data":"f20e9dc32e437145f802a537fa2665afc2ae79a191175483ec92ca0e4108918e"}
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.836715 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f20e9dc32e437145f802a537fa2665afc2ae79a191175483ec92ca0e4108918e"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.836681 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hx7tv"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.838623 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"039a526c-4f5a-4641-9340-b18459145569","Type":"ContainerStarted","Data":"cbe9c1822db3e4df38f03422ecc405cf112afecffa429cbfc14cbe462e4d38fe"}
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.839105 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.844251 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.849666 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerStarted","Data":"2634435ab0e106f5ce9041eacdd8794376187c382228fa8d9f52a71bd9ec4553"}
Feb 17 14:26:44 crc kubenswrapper[4836]: I0217 14:26:44.880340 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=29.436356179 podStartE2EDuration="1m0.88030862s" podCreationTimestamp="2026-02-17 14:25:44 +0000 UTC" firstStartedPulling="2026-02-17 14:26:08.219685214 +0000 UTC m=+1194.562613483" lastFinishedPulling="2026-02-17 14:26:39.663637655 +0000 UTC m=+1226.006565924" observedRunningTime="2026-02-17 14:26:44.862773765 +0000 UTC m=+1231.205702054" watchObservedRunningTime="2026-02-17 14:26:44.88030862 +0000 UTC m=+1231.223236889"
Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.863181 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d162-account-create-update-khb5j" event={"ID":"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5","Type":"ContainerDied","Data":"4488137d41693a0eed0cf3344bd79369971085583a9f2a449f30914ec350a79a"}
Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.863511 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4488137d41693a0eed0cf3344bd79369971085583a9f2a449f30914ec350a79a"
Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.870167 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-83de-account-create-update-fh75b" event={"ID":"54905e17-d443-4465-8f70-7be04a89086f","Type":"ContainerDied","Data":"944bbce44653f332757b41aa709ff915e239c63242463d545f1454d580e8215e"}
Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.870224 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944bbce44653f332757b41aa709ff915e239c63242463d545f1454d580e8215e"
Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.872906 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/keystone-db-create-k7zc9" event={"ID":"e562d506-21d2-4edd-90b8-97bd11bf068e","Type":"ContainerDied","Data":"7bdbd8741734a15b799a8b486b726df845fdab465fc49bc23c59d7482c73e7d5"} Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.872935 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bdbd8741734a15b799a8b486b726df845fdab465fc49bc23c59d7482c73e7d5" Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.874821 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-c8vxs" event={"ID":"21d02a34-d68b-4cae-9f03-0b15d07fe948","Type":"ContainerDied","Data":"f11b06be301a9f9b682d6007240d8ae8a505a1f6da93c11fc5e55f0036da5ae0"} Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.874843 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f11b06be301a9f9b682d6007240d8ae8a505a1f6da93c11fc5e55f0036da5ae0" Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.876308 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d8f3-account-create-update-kmlvm" event={"ID":"2ae1659d-7892-4744-a570-4ba7c65e4caf","Type":"ContainerDied","Data":"053f72cbdf2c7e5f30db435af69e5bbe1df08b8271492028d870e534720e3fc6"} Feb 17 14:26:45 crc kubenswrapper[4836]: I0217 14:26:45.876335 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053f72cbdf2c7e5f30db435af69e5bbe1df08b8271492028d870e534720e3fc6" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.011163 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.072765 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.132409 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54905e17-d443-4465-8f70-7be04a89086f-operator-scripts\") pod \"54905e17-d443-4465-8f70-7be04a89086f\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.133009 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54905e17-d443-4465-8f70-7be04a89086f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54905e17-d443-4465-8f70-7be04a89086f" (UID: "54905e17-d443-4465-8f70-7be04a89086f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.134446 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8nc5\" (UniqueName: \"kubernetes.io/projected/54905e17-d443-4465-8f70-7be04a89086f-kube-api-access-r8nc5\") pod \"54905e17-d443-4465-8f70-7be04a89086f\" (UID: \"54905e17-d443-4465-8f70-7be04a89086f\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.135240 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54905e17-d443-4465-8f70-7be04a89086f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.142767 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.150833 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54905e17-d443-4465-8f70-7be04a89086f-kube-api-access-r8nc5" (OuterVolumeSpecName: "kube-api-access-r8nc5") pod "54905e17-d443-4465-8f70-7be04a89086f" (UID: "54905e17-d443-4465-8f70-7be04a89086f"). InnerVolumeSpecName "kube-api-access-r8nc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.230610 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8vxs" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.235721 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9ljc\" (UniqueName: \"kubernetes.io/projected/2ae1659d-7892-4744-a570-4ba7c65e4caf-kube-api-access-g9ljc\") pod \"2ae1659d-7892-4744-a570-4ba7c65e4caf\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.235800 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d02a34-d68b-4cae-9f03-0b15d07fe948-operator-scripts\") pod \"21d02a34-d68b-4cae-9f03-0b15d07fe948\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.235850 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stl2m\" (UniqueName: \"kubernetes.io/projected/e562d506-21d2-4edd-90b8-97bd11bf068e-kube-api-access-stl2m\") pod \"e562d506-21d2-4edd-90b8-97bd11bf068e\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.235885 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1659d-7892-4744-a570-4ba7c65e4caf-operator-scripts\") pod \"2ae1659d-7892-4744-a570-4ba7c65e4caf\" (UID: \"2ae1659d-7892-4744-a570-4ba7c65e4caf\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.235920 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e562d506-21d2-4edd-90b8-97bd11bf068e-operator-scripts\") pod \"e562d506-21d2-4edd-90b8-97bd11bf068e\" (UID: \"e562d506-21d2-4edd-90b8-97bd11bf068e\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.235954 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f86lz\" (UniqueName: \"kubernetes.io/projected/21d02a34-d68b-4cae-9f03-0b15d07fe948-kube-api-access-f86lz\") pod \"21d02a34-d68b-4cae-9f03-0b15d07fe948\" (UID: \"21d02a34-d68b-4cae-9f03-0b15d07fe948\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.236226 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8nc5\" (UniqueName: \"kubernetes.io/projected/54905e17-d443-4465-8f70-7be04a89086f-kube-api-access-r8nc5\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.237083 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21d02a34-d68b-4cae-9f03-0b15d07fe948-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "21d02a34-d68b-4cae-9f03-0b15d07fe948" (UID: "21d02a34-d68b-4cae-9f03-0b15d07fe948"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.237470 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ae1659d-7892-4744-a570-4ba7c65e4caf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ae1659d-7892-4744-a570-4ba7c65e4caf" (UID: "2ae1659d-7892-4744-a570-4ba7c65e4caf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.237564 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e562d506-21d2-4edd-90b8-97bd11bf068e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e562d506-21d2-4edd-90b8-97bd11bf068e" (UID: "e562d506-21d2-4edd-90b8-97bd11bf068e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.241030 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e562d506-21d2-4edd-90b8-97bd11bf068e-kube-api-access-stl2m" (OuterVolumeSpecName: "kube-api-access-stl2m") pod "e562d506-21d2-4edd-90b8-97bd11bf068e" (UID: "e562d506-21d2-4edd-90b8-97bd11bf068e"). InnerVolumeSpecName "kube-api-access-stl2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.245937 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d162-account-create-update-khb5j" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.246263 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d02a34-d68b-4cae-9f03-0b15d07fe948-kube-api-access-f86lz" (OuterVolumeSpecName: "kube-api-access-f86lz") pod "21d02a34-d68b-4cae-9f03-0b15d07fe948" (UID: "21d02a34-d68b-4cae-9f03-0b15d07fe948"). 
InnerVolumeSpecName "kube-api-access-f86lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.246345 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae1659d-7892-4744-a570-4ba7c65e4caf-kube-api-access-g9ljc" (OuterVolumeSpecName: "kube-api-access-g9ljc") pod "2ae1659d-7892-4744-a570-4ba7c65e4caf" (UID: "2ae1659d-7892-4744-a570-4ba7c65e4caf"). InnerVolumeSpecName "kube-api-access-g9ljc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.338769 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/21d02a34-d68b-4cae-9f03-0b15d07fe948-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.339418 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stl2m\" (UniqueName: \"kubernetes.io/projected/e562d506-21d2-4edd-90b8-97bd11bf068e-kube-api-access-stl2m\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.339753 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae1659d-7892-4744-a570-4ba7c65e4caf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.339824 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e562d506-21d2-4edd-90b8-97bd11bf068e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.339843 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f86lz\" (UniqueName: \"kubernetes.io/projected/21d02a34-d68b-4cae-9f03-0b15d07fe948-kube-api-access-f86lz\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 
14:26:46.339859 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9ljc\" (UniqueName: \"kubernetes.io/projected/2ae1659d-7892-4744-a570-4ba7c65e4caf-kube-api-access-g9ljc\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.441487 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-operator-scripts\") pod \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.441711 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sfbb\" (UniqueName: \"kubernetes.io/projected/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-kube-api-access-8sfbb\") pod \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\" (UID: \"1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5\") " Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.443082 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5" (UID: "1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.447969 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-kube-api-access-8sfbb" (OuterVolumeSpecName: "kube-api-access-8sfbb") pod "1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5" (UID: "1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5"). InnerVolumeSpecName "kube-api-access-8sfbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.547861 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sfbb\" (UniqueName: \"kubernetes.io/projected/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-kube-api-access-8sfbb\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.547968 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.887958 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f866bb7-5209-4275-8884-df6f074b3f7c","Type":"ContainerStarted","Data":"a5c50e91fbe9d5bf5c447d513b9cd45546c7f0ab529bc7790065740b89966019"} Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.889665 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dbzmx" event={"ID":"cb33695b-c451-44b2-8a2a-fe534a4040e3","Type":"ContainerStarted","Data":"3f3e6d9b2f9b81e95f3278234cf18a3d4bff52824dc7f44df99e615056b57f74"} Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.890251 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.894518 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ec9408e6-0474-4f84-842e-b1c20f42a7b8","Type":"ContainerStarted","Data":"a85bb5d25c822d0a6fbd4857f4d63038e54f36103237d14bf65da5288ba6755c"} Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.894565 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k7zc9" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.895005 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d162-account-create-update-khb5j" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.895059 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-83de-account-create-update-fh75b" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.895065 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-c8vxs" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.895182 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d8f3-account-create-update-kmlvm" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.928527 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dbzmx" podStartSLOduration=5.785915217 podStartE2EDuration="10.928500809s" podCreationTimestamp="2026-02-17 14:26:36 +0000 UTC" firstStartedPulling="2026-02-17 14:26:40.721955617 +0000 UTC m=+1227.064883886" lastFinishedPulling="2026-02-17 14:26:45.864541209 +0000 UTC m=+1232.207469478" observedRunningTime="2026-02-17 14:26:46.917141249 +0000 UTC m=+1233.260069528" watchObservedRunningTime="2026-02-17 14:26:46.928500809 +0000 UTC m=+1233.271429078" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.953054 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.335145136 podStartE2EDuration="1m9.95303316s" podCreationTimestamp="2026-02-17 14:25:37 +0000 UTC" firstStartedPulling="2026-02-17 14:25:39.25139841 +0000 UTC m=+1165.594326679" lastFinishedPulling="2026-02-17 14:26:07.869286434 +0000 UTC m=+1194.212214703" observedRunningTime="2026-02-17 
14:26:46.944675108 +0000 UTC m=+1233.287603387" watchObservedRunningTime="2026-02-17 14:26:46.95303316 +0000 UTC m=+1233.295961429" Feb 17 14:26:46 crc kubenswrapper[4836]: I0217 14:26:46.979222 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.04717529 podStartE2EDuration="1m10.979202933s" podCreationTimestamp="2026-02-17 14:25:36 +0000 UTC" firstStartedPulling="2026-02-17 14:25:38.624540408 +0000 UTC m=+1164.967468677" lastFinishedPulling="2026-02-17 14:26:07.556568051 +0000 UTC m=+1193.899496320" observedRunningTime="2026-02-17 14:26:46.976911962 +0000 UTC m=+1233.319840241" watchObservedRunningTime="2026-02-17 14:26:46.979202933 +0000 UTC m=+1233.322131222" Feb 17 14:26:47 crc kubenswrapper[4836]: I0217 14:26:47.831924 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="1c33fb01-9bf7-43f1-86d5-004e70d3721c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:26:48 crc kubenswrapper[4836]: I0217 14:26:48.073045 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 14:26:49 crc kubenswrapper[4836]: I0217 14:26:49.113086 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 14:26:49 crc kubenswrapper[4836]: I0217 14:26:49.747496 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:26:49 crc kubenswrapper[4836]: I0217 14:26:49.828768 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bpss8"] Feb 17 14:26:49 crc kubenswrapper[4836]: I0217 14:26:49.829095 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" podUID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerName="dnsmasq-dns" 
containerID="cri-o://a4dd5c55405656df129bfcb6d3d7edde886d28467877f4823411944db38277ef" gracePeriod=10 Feb 17 14:26:50 crc kubenswrapper[4836]: I0217 14:26:50.108747 4836 generic.go:334] "Generic (PLEG): container finished" podID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerID="a4dd5c55405656df129bfcb6d3d7edde886d28467877f4823411944db38277ef" exitCode=0 Feb 17 14:26:50 crc kubenswrapper[4836]: I0217 14:26:50.108945 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" event={"ID":"7e0a6937-945b-48fc-a328-6715e10ffddc","Type":"ContainerDied","Data":"a4dd5c55405656df129bfcb6d3d7edde886d28467877f4823411944db38277ef"} Feb 17 14:26:50 crc kubenswrapper[4836]: I0217 14:26:50.259650 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-c8vxs"] Feb 17 14:26:50 crc kubenswrapper[4836]: I0217 14:26:50.271397 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-c8vxs"] Feb 17 14:26:50 crc kubenswrapper[4836]: I0217 14:26:50.582587 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d02a34-d68b-4cae-9f03-0b15d07fe948" path="/var/lib/kubelet/pods/21d02a34-d68b-4cae-9f03-0b15d07fe948/volumes" Feb 17 14:26:50 crc kubenswrapper[4836]: I0217 14:26:50.837999 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.012939 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-sb\") pod \"7e0a6937-945b-48fc-a328-6715e10ffddc\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.013013 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-config\") pod \"7e0a6937-945b-48fc-a328-6715e10ffddc\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.013111 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx4bn\" (UniqueName: \"kubernetes.io/projected/7e0a6937-945b-48fc-a328-6715e10ffddc-kube-api-access-tx4bn\") pod \"7e0a6937-945b-48fc-a328-6715e10ffddc\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.013226 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-nb\") pod \"7e0a6937-945b-48fc-a328-6715e10ffddc\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.013321 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-dns-svc\") pod \"7e0a6937-945b-48fc-a328-6715e10ffddc\" (UID: \"7e0a6937-945b-48fc-a328-6715e10ffddc\") " Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.123853 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7e0a6937-945b-48fc-a328-6715e10ffddc-kube-api-access-tx4bn" (OuterVolumeSpecName: "kube-api-access-tx4bn") pod "7e0a6937-945b-48fc-a328-6715e10ffddc" (UID: "7e0a6937-945b-48fc-a328-6715e10ffddc"). InnerVolumeSpecName "kube-api-access-tx4bn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.141699 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerStarted","Data":"839af704fe28aeff5f1ab20ca6e7c7a0fb25790fc5bc232fe9131c132f8e0bf9"} Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.144791 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" event={"ID":"7e0a6937-945b-48fc-a328-6715e10ffddc","Type":"ContainerDied","Data":"ff5ecfc3d719da4b799fbc70b95c4645eaf91702ed47ae7bfdec7b990d4e151b"} Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.145009 4836 scope.go:117] "RemoveContainer" containerID="a4dd5c55405656df129bfcb6d3d7edde886d28467877f4823411944db38277ef" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.145217 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-bpss8" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.218978 4836 scope.go:117] "RemoveContainer" containerID="5d73acc7d3b7d21dfd57bd1f5f6891bf754918c51d20232beb9b0071a1de3710" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.219083 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.439883842 podStartE2EDuration="1m7.219050666s" podCreationTimestamp="2026-02-17 14:25:44 +0000 UTC" firstStartedPulling="2026-02-17 14:26:06.510550124 +0000 UTC m=+1192.853478393" lastFinishedPulling="2026-02-17 14:26:50.289716948 +0000 UTC m=+1236.632645217" observedRunningTime="2026-02-17 14:26:51.192702944 +0000 UTC m=+1237.535631223" watchObservedRunningTime="2026-02-17 14:26:51.219050666 +0000 UTC m=+1237.561978935" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.219848 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e0a6937-945b-48fc-a328-6715e10ffddc" (UID: "7e0a6937-945b-48fc-a328-6715e10ffddc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.224596 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.224620 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx4bn\" (UniqueName: \"kubernetes.io/projected/7e0a6937-945b-48fc-a328-6715e10ffddc-kube-api-access-tx4bn\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.229726 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-config" (OuterVolumeSpecName: "config") pod "7e0a6937-945b-48fc-a328-6715e10ffddc" (UID: "7e0a6937-945b-48fc-a328-6715e10ffddc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.240114 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e0a6937-945b-48fc-a328-6715e10ffddc" (UID: "7e0a6937-945b-48fc-a328-6715e10ffddc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.244532 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e0a6937-945b-48fc-a328-6715e10ffddc" (UID: "7e0a6937-945b-48fc-a328-6715e10ffddc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.326615 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.328660 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.328791 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a6937-945b-48fc-a328-6715e10ffddc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.430874 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.431315 4836 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.431785 4836 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.431899 4836 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift podName:e482046c-502a-4f41-b013-7b3ef1c71ee1 nodeName:}" failed. No retries permitted until 2026-02-17 14:27:07.431860228 +0000 UTC m=+1253.774788497 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift") pod "swift-storage-0" (UID: "e482046c-502a-4f41-b013-7b3ef1c71ee1") : configmap "swift-ring-files" not found Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.489507 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bpss8"] Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.506820 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-bpss8"] Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816218 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-z8g7x"] Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816648 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54905e17-d443-4465-8f70-7be04a89086f" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816662 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="54905e17-d443-4465-8f70-7be04a89086f" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816674 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerName="dnsmasq-dns" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816680 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerName="dnsmasq-dns" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816693 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerName="dnsmasq-dns" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816700 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerName="dnsmasq-dns" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816708 
4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d02a34-d68b-4cae-9f03-0b15d07fe948" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816715 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d02a34-d68b-4cae-9f03-0b15d07fe948" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816735 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add50d48-0a1c-4d2f-bcc3-ae9355e95c3b" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816742 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="add50d48-0a1c-4d2f-bcc3-ae9355e95c3b" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816755 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816761 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816774 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e562d506-21d2-4edd-90b8-97bd11bf068e" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816780 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e562d506-21d2-4edd-90b8-97bd11bf068e" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816791 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816796 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b" 
containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816806 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae1659d-7892-4744-a570-4ba7c65e4caf" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816812 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae1659d-7892-4744-a570-4ba7c65e4caf" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816822 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerName="init" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816828 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerName="init" Feb 17 14:26:51 crc kubenswrapper[4836]: E0217 14:26:51.816841 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerName="init" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.816847 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerName="init" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817004 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae1659d-7892-4744-a570-4ba7c65e4caf" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817027 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae1de151-2799-49ba-839c-70e035c6f1d5" containerName="dnsmasq-dns" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817035 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="e562d506-21d2-4edd-90b8-97bd11bf068e" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817049 4836 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817061 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0a6937-945b-48fc-a328-6715e10ffddc" containerName="dnsmasq-dns" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817072 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="54905e17-d443-4465-8f70-7be04a89086f" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817087 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d02a34-d68b-4cae-9f03-0b15d07fe948" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817102 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="add50d48-0a1c-4d2f-bcc3-ae9355e95c3b" containerName="mariadb-database-create" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817112 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5" containerName="mariadb-account-create-update" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.817834 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.820495 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.820854 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qbbvn" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.835021 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z8g7x"] Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.886168 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ghk5k" podUID="5949d44f-ef6d-417e-9035-9b235cd59863" containerName="ovn-controller" probeResult="failure" output=< Feb 17 14:26:51 crc kubenswrapper[4836]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 14:26:51 crc kubenswrapper[4836]: > Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.942608 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grffb\" (UniqueName: \"kubernetes.io/projected/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-kube-api-access-grffb\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.943160 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-combined-ca-bundle\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.943315 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-config-data\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:51 crc kubenswrapper[4836]: I0217 14:26:51.943571 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-db-sync-config-data\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.107036 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-combined-ca-bundle\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.107099 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-config-data\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.107189 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-db-sync-config-data\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.107243 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grffb\" (UniqueName: \"kubernetes.io/projected/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-kube-api-access-grffb\") pod 
\"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.121808 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-db-sync-config-data\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.131891 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-config-data\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.132245 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-combined-ca-bundle\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.141608 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grffb\" (UniqueName: \"kubernetes.io/projected/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-kube-api-access-grffb\") pod \"glance-db-sync-z8g7x\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.163823 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-z8g7x" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.587727 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0a6937-945b-48fc-a328-6715e10ffddc" path="/var/lib/kubelet/pods/7e0a6937-945b-48fc-a328-6715e10ffddc/volumes" Feb 17 14:26:52 crc kubenswrapper[4836]: I0217 14:26:52.849114 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-z8g7x"] Feb 17 14:26:52 crc kubenswrapper[4836]: W0217 14:26:52.862395 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf3a6cf1_bca0_45b2_9f7c_6d483452d49d.slice/crio-a970e805deb8fc7e4ea80574fe0f4020e8d303f5c75ae4049947b41814dd24fc WatchSource:0}: Error finding container a970e805deb8fc7e4ea80574fe0f4020e8d303f5c75ae4049947b41814dd24fc: Status 404 returned error can't find the container with id a970e805deb8fc7e4ea80574fe0f4020e8d303f5c75ae4049947b41814dd24fc Feb 17 14:26:53 crc kubenswrapper[4836]: I0217 14:26:53.220120 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z8g7x" event={"ID":"df3a6cf1-bca0-45b2-9f7c-6d483452d49d","Type":"ContainerStarted","Data":"a970e805deb8fc7e4ea80574fe0f4020e8d303f5c75ae4049947b41814dd24fc"} Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.251825 4836 generic.go:334] "Generic (PLEG): container finished" podID="cb33695b-c451-44b2-8a2a-fe534a4040e3" containerID="3f3e6d9b2f9b81e95f3278234cf18a3d4bff52824dc7f44df99e615056b57f74" exitCode=0 Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.252346 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dbzmx" event={"ID":"cb33695b-c451-44b2-8a2a-fe534a4040e3","Type":"ContainerDied","Data":"3f3e6d9b2f9b81e95f3278234cf18a3d4bff52824dc7f44df99e615056b57f74"} Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.257448 4836 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/root-account-create-update-h9gmq"] Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.259244 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.264655 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.274736 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h9gmq"] Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.446547 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa6524b-2b3f-47c3-b55f-1435685df59d-operator-scripts\") pod \"root-account-create-update-h9gmq\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.446805 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5f4x\" (UniqueName: \"kubernetes.io/projected/caa6524b-2b3f-47c3-b55f-1435685df59d-kube-api-access-v5f4x\") pod \"root-account-create-update-h9gmq\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.471656 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.549672 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa6524b-2b3f-47c3-b55f-1435685df59d-operator-scripts\") pod \"root-account-create-update-h9gmq\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " 
pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.549820 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5f4x\" (UniqueName: \"kubernetes.io/projected/caa6524b-2b3f-47c3-b55f-1435685df59d-kube-api-access-v5f4x\") pod \"root-account-create-update-h9gmq\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.551006 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa6524b-2b3f-47c3-b55f-1435685df59d-operator-scripts\") pod \"root-account-create-update-h9gmq\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.581822 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5f4x\" (UniqueName: \"kubernetes.io/projected/caa6524b-2b3f-47c3-b55f-1435685df59d-kube-api-access-v5f4x\") pod \"root-account-create-update-h9gmq\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:55 crc kubenswrapper[4836]: I0217 14:26:55.583394 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h9gmq" Feb 17 14:26:56 crc kubenswrapper[4836]: I0217 14:26:56.170615 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-h9gmq"] Feb 17 14:26:56 crc kubenswrapper[4836]: I0217 14:26:56.335763 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h9gmq" event={"ID":"caa6524b-2b3f-47c3-b55f-1435685df59d","Type":"ContainerStarted","Data":"4412cdd3236c16e7c55d72426203ad2b29aa25a446957f2189655406d782c8f6"} Feb 17 14:26:56 crc kubenswrapper[4836]: I0217 14:26:56.896655 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ghk5k" podUID="5949d44f-ef6d-417e-9035-9b235cd59863" containerName="ovn-controller" probeResult="failure" output=< Feb 17 14:26:56 crc kubenswrapper[4836]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 14:26:56 crc kubenswrapper[4836]: > Feb 17 14:26:56 crc kubenswrapper[4836]: I0217 14:26:56.940269 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dbzmx" Feb 17 14:26:56 crc kubenswrapper[4836]: I0217 14:26:56.968411 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j4jj9" Feb 17 14:26:56 crc kubenswrapper[4836]: I0217 14:26:56.970124 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-j4jj9" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.135754 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-combined-ca-bundle\") pod \"cb33695b-c451-44b2-8a2a-fe534a4040e3\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.137793 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-dispersionconf\") pod \"cb33695b-c451-44b2-8a2a-fe534a4040e3\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.138070 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb33695b-c451-44b2-8a2a-fe534a4040e3-etc-swift\") pod \"cb33695b-c451-44b2-8a2a-fe534a4040e3\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.138135 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlxn6\" (UniqueName: \"kubernetes.io/projected/cb33695b-c451-44b2-8a2a-fe534a4040e3-kube-api-access-tlxn6\") pod \"cb33695b-c451-44b2-8a2a-fe534a4040e3\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.138170 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-scripts\") pod \"cb33695b-c451-44b2-8a2a-fe534a4040e3\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.138261 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-ring-data-devices\") pod \"cb33695b-c451-44b2-8a2a-fe534a4040e3\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.138291 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-swiftconf\") pod \"cb33695b-c451-44b2-8a2a-fe534a4040e3\" (UID: \"cb33695b-c451-44b2-8a2a-fe534a4040e3\") " Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.139168 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cb33695b-c451-44b2-8a2a-fe534a4040e3" (UID: "cb33695b-c451-44b2-8a2a-fe534a4040e3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.139531 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb33695b-c451-44b2-8a2a-fe534a4040e3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cb33695b-c451-44b2-8a2a-fe534a4040e3" (UID: "cb33695b-c451-44b2-8a2a-fe534a4040e3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.140392 4836 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.140422 4836 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb33695b-c451-44b2-8a2a-fe534a4040e3-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.149500 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cb33695b-c451-44b2-8a2a-fe534a4040e3" (UID: "cb33695b-c451-44b2-8a2a-fe534a4040e3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.287182 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb33695b-c451-44b2-8a2a-fe534a4040e3-kube-api-access-tlxn6" (OuterVolumeSpecName: "kube-api-access-tlxn6") pod "cb33695b-c451-44b2-8a2a-fe534a4040e3" (UID: "cb33695b-c451-44b2-8a2a-fe534a4040e3"). InnerVolumeSpecName "kube-api-access-tlxn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.290562 4836 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.290608 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlxn6\" (UniqueName: \"kubernetes.io/projected/cb33695b-c451-44b2-8a2a-fe534a4040e3-kube-api-access-tlxn6\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.313265 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb33695b-c451-44b2-8a2a-fe534a4040e3" (UID: "cb33695b-c451-44b2-8a2a-fe534a4040e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.342965 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-scripts" (OuterVolumeSpecName: "scripts") pod "cb33695b-c451-44b2-8a2a-fe534a4040e3" (UID: "cb33695b-c451-44b2-8a2a-fe534a4040e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.344728 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cb33695b-c451-44b2-8a2a-fe534a4040e3" (UID: "cb33695b-c451-44b2-8a2a-fe534a4040e3"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.351051 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ghk5k-config-tv4tb"] Feb 17 14:26:57 crc kubenswrapper[4836]: E0217 14:26:57.351928 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb33695b-c451-44b2-8a2a-fe534a4040e3" containerName="swift-ring-rebalance" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.351964 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb33695b-c451-44b2-8a2a-fe534a4040e3" containerName="swift-ring-rebalance" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.352242 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb33695b-c451-44b2-8a2a-fe534a4040e3" containerName="swift-ring-rebalance" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.353356 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.357835 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.369110 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dbzmx" event={"ID":"cb33695b-c451-44b2-8a2a-fe534a4040e3","Type":"ContainerDied","Data":"0863004180b5c7074ba22f1ddb8c58005ebe6a0d2ac8583efc764697e8242881"} Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.369175 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0863004180b5c7074ba22f1ddb8c58005ebe6a0d2ac8583efc764697e8242881" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.369263 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dbzmx" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.379456 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ghk5k-config-tv4tb"] Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.396841 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.396951 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-additional-scripts\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.397563 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-scripts\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.397646 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-log-ovn\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.397760 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz7ht\" (UniqueName: \"kubernetes.io/projected/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-kube-api-access-xz7ht\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.397822 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run-ovn\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.397960 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.397979 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb33695b-c451-44b2-8a2a-fe534a4040e3-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.397991 4836 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb33695b-c451-44b2-8a2a-fe534a4040e3-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.398937 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h9gmq" event={"ID":"caa6524b-2b3f-47c3-b55f-1435685df59d","Type":"ContainerStarted","Data":"ca8e0602e1b36f3c2d9bfabc7020988df18e6945d19646bd583313467d47a539"} Feb 17 14:26:57 crc kubenswrapper[4836]: I0217 14:26:57.455821 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/root-account-create-update-h9gmq" podStartSLOduration=2.455781994 podStartE2EDuration="2.455781994s" podCreationTimestamp="2026-02-17 14:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:57.428837576 +0000 UTC m=+1243.771765865" watchObservedRunningTime="2026-02-17 14:26:57.455781994 +0000 UTC m=+1243.798710273" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.504515 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-additional-scripts\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.505436 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-scripts\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.505487 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-log-ovn\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.505608 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz7ht\" (UniqueName: \"kubernetes.io/projected/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-kube-api-access-xz7ht\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " 
pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.505693 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run-ovn\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.505872 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.506455 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.507108 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-log-ovn\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.507656 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-additional-scripts\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 
14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.507991 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run-ovn\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.511251 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-scripts\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.533484 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz7ht\" (UniqueName: \"kubernetes.io/projected/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-kube-api-access-xz7ht\") pod \"ovn-controller-ghk5k-config-tv4tb\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.762927 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:57.836899 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="1c33fb01-9bf7-43f1-86d5-004e70d3721c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:58.074908 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ec9408e6-0474-4f84-842e-b1c20f42a7b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:58.476585 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:58.496427 4836 generic.go:334] "Generic (PLEG): container finished" podID="caa6524b-2b3f-47c3-b55f-1435685df59d" containerID="ca8e0602e1b36f3c2d9bfabc7020988df18e6945d19646bd583313467d47a539" exitCode=0 Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:58.496529 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h9gmq" event={"ID":"caa6524b-2b3f-47c3-b55f-1435685df59d","Type":"ContainerDied","Data":"ca8e0602e1b36f3c2d9bfabc7020988df18e6945d19646bd583313467d47a539"} Feb 17 14:26:58 crc kubenswrapper[4836]: I0217 14:26:58.892715 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ghk5k-config-tv4tb"] Feb 17 14:26:58 crc kubenswrapper[4836]: W0217 14:26:58.918503 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9f3dbb_ea37_4057_97c1_b93cbb39aaec.slice/crio-c2bde1d5ba9adc61b0950993bd532e188bd3fc6a817496d9051110a296f9c5b2 WatchSource:0}: Error finding container 
c2bde1d5ba9adc61b0950993bd532e188bd3fc6a817496d9051110a296f9c5b2: Status 404 returned error can't find the container with id c2bde1d5ba9adc61b0950993bd532e188bd3fc6a817496d9051110a296f9c5b2 Feb 17 14:26:59 crc kubenswrapper[4836]: I0217 14:26:59.514388 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ghk5k-config-tv4tb" event={"ID":"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec","Type":"ContainerStarted","Data":"faf1f0c01e2ba58effda0101e73091532e490c7632b908240461cde1c4eacd7e"} Feb 17 14:26:59 crc kubenswrapper[4836]: I0217 14:26:59.515095 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ghk5k-config-tv4tb" event={"ID":"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec","Type":"ContainerStarted","Data":"c2bde1d5ba9adc61b0950993bd532e188bd3fc6a817496d9051110a296f9c5b2"} Feb 17 14:26:59 crc kubenswrapper[4836]: I0217 14:26:59.766046 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:26:59 crc kubenswrapper[4836]: I0217 14:26:59.766146 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:26:59 crc kubenswrapper[4836]: I0217 14:26:59.766236 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:26:59 crc kubenswrapper[4836]: I0217 14:26:59.767609 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"790067b54b3531952a7756a09b793da1fc53330ef71b8011e59f530ae444594e"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:26:59 crc kubenswrapper[4836]: I0217 14:26:59.767726 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://790067b54b3531952a7756a09b793da1fc53330ef71b8011e59f530ae444594e" gracePeriod=600 Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.143236 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-h9gmq" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.180009 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ghk5k-config-tv4tb" podStartSLOduration=3.179973354 podStartE2EDuration="3.179973354s" podCreationTimestamp="2026-02-17 14:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:26:59.543119181 +0000 UTC m=+1245.886047470" watchObservedRunningTime="2026-02-17 14:27:00.179973354 +0000 UTC m=+1246.522901623" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.253664 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa6524b-2b3f-47c3-b55f-1435685df59d-operator-scripts\") pod \"caa6524b-2b3f-47c3-b55f-1435685df59d\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.253902 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5f4x\" (UniqueName: 
\"kubernetes.io/projected/caa6524b-2b3f-47c3-b55f-1435685df59d-kube-api-access-v5f4x\") pod \"caa6524b-2b3f-47c3-b55f-1435685df59d\" (UID: \"caa6524b-2b3f-47c3-b55f-1435685df59d\") " Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.254690 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa6524b-2b3f-47c3-b55f-1435685df59d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "caa6524b-2b3f-47c3-b55f-1435685df59d" (UID: "caa6524b-2b3f-47c3-b55f-1435685df59d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.254807 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caa6524b-2b3f-47c3-b55f-1435685df59d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.261799 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa6524b-2b3f-47c3-b55f-1435685df59d-kube-api-access-v5f4x" (OuterVolumeSpecName: "kube-api-access-v5f4x") pod "caa6524b-2b3f-47c3-b55f-1435685df59d" (UID: "caa6524b-2b3f-47c3-b55f-1435685df59d"). InnerVolumeSpecName "kube-api-access-v5f4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.358194 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5f4x\" (UniqueName: \"kubernetes.io/projected/caa6524b-2b3f-47c3-b55f-1435685df59d-kube-api-access-v5f4x\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.471684 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.476625 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.710505 4836 generic.go:334] "Generic (PLEG): container finished" podID="fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" containerID="faf1f0c01e2ba58effda0101e73091532e490c7632b908240461cde1c4eacd7e" exitCode=0 Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.710737 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ghk5k-config-tv4tb" event={"ID":"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec","Type":"ContainerDied","Data":"faf1f0c01e2ba58effda0101e73091532e490c7632b908240461cde1c4eacd7e"} Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.718030 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-h9gmq" event={"ID":"caa6524b-2b3f-47c3-b55f-1435685df59d","Type":"ContainerDied","Data":"4412cdd3236c16e7c55d72426203ad2b29aa25a446957f2189655406d782c8f6"} Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.718114 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4412cdd3236c16e7c55d72426203ad2b29aa25a446957f2189655406d782c8f6" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.718206 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-h9gmq" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.735360 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"790067b54b3531952a7756a09b793da1fc53330ef71b8011e59f530ae444594e"} Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.735709 4836 scope.go:117] "RemoveContainer" containerID="89b78e4cc2264dc06417ab903dd2a1618c1aee2c1d950babae0b011a2e9eac59" Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.735297 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="790067b54b3531952a7756a09b793da1fc53330ef71b8011e59f530ae444594e" exitCode=0 Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.735952 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"3c09fe81ffce38e5d9ef4195d8e69df0edfb238c5a8b73cb36be460e79dea4bb"} Feb 17 14:27:00 crc kubenswrapper[4836]: I0217 14:27:00.738369 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:01 crc kubenswrapper[4836]: I0217 14:27:01.882982 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ghk5k" Feb 17 14:27:04 crc kubenswrapper[4836]: I0217 14:27:04.956519 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:27:04 crc kubenswrapper[4836]: I0217 14:27:04.958176 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="prometheus" 
containerID="cri-o://5de26698cc194f27aa6fa46281e03b3fa0bc2faa6bf0ef9b745f3fec33e05835" gracePeriod=600 Feb 17 14:27:04 crc kubenswrapper[4836]: I0217 14:27:04.958587 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="thanos-sidecar" containerID="cri-o://839af704fe28aeff5f1ab20ca6e7c7a0fb25790fc5bc232fe9131c132f8e0bf9" gracePeriod=600 Feb 17 14:27:04 crc kubenswrapper[4836]: I0217 14:27:04.958692 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="config-reloader" containerID="cri-o://2634435ab0e106f5ce9041eacdd8794376187c382228fa8d9f52a71bd9ec4553" gracePeriod=600 Feb 17 14:27:05 crc kubenswrapper[4836]: I0217 14:27:05.472142 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": dial tcp 10.217.0.114:9090: connect: connection refused" Feb 17 14:27:05 crc kubenswrapper[4836]: I0217 14:27:05.933325 4836 generic.go:334] "Generic (PLEG): container finished" podID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerID="839af704fe28aeff5f1ab20ca6e7c7a0fb25790fc5bc232fe9131c132f8e0bf9" exitCode=0 Feb 17 14:27:05 crc kubenswrapper[4836]: I0217 14:27:05.933369 4836 generic.go:334] "Generic (PLEG): container finished" podID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerID="2634435ab0e106f5ce9041eacdd8794376187c382228fa8d9f52a71bd9ec4553" exitCode=0 Feb 17 14:27:05 crc kubenswrapper[4836]: I0217 14:27:05.933379 4836 generic.go:334] "Generic (PLEG): container finished" podID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerID="5de26698cc194f27aa6fa46281e03b3fa0bc2faa6bf0ef9b745f3fec33e05835" exitCode=0 Feb 17 14:27:05 crc 
kubenswrapper[4836]: I0217 14:27:05.933407 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerDied","Data":"839af704fe28aeff5f1ab20ca6e7c7a0fb25790fc5bc232fe9131c132f8e0bf9"} Feb 17 14:27:05 crc kubenswrapper[4836]: I0217 14:27:05.933459 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerDied","Data":"2634435ab0e106f5ce9041eacdd8794376187c382228fa8d9f52a71bd9ec4553"} Feb 17 14:27:05 crc kubenswrapper[4836]: I0217 14:27:05.933475 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerDied","Data":"5de26698cc194f27aa6fa46281e03b3fa0bc2faa6bf0ef9b745f3fec33e05835"} Feb 17 14:27:07 crc kubenswrapper[4836]: I0217 14:27:07.478982 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0" Feb 17 14:27:07 crc kubenswrapper[4836]: I0217 14:27:07.490418 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e482046c-502a-4f41-b013-7b3ef1c71ee1-etc-swift\") pod \"swift-storage-0\" (UID: \"e482046c-502a-4f41-b013-7b3ef1c71ee1\") " pod="openstack/swift-storage-0" Feb 17 14:27:07 crc kubenswrapper[4836]: I0217 14:27:07.890593 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 17 14:27:07 crc kubenswrapper[4836]: I0217 14:27:07.902342 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="1c33fb01-9bf7-43f1-86d5-004e70d3721c" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.074616 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.448830 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-w5qdk"] Feb 17 14:27:08 crc kubenswrapper[4836]: E0217 14:27:08.453879 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa6524b-2b3f-47c3-b55f-1435685df59d" containerName="mariadb-account-create-update" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.453999 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa6524b-2b3f-47c3-b55f-1435685df59d" containerName="mariadb-account-create-update" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.454522 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa6524b-2b3f-47c3-b55f-1435685df59d" containerName="mariadb-account-create-update" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.455784 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.474789 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w5qdk"] Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.865639 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb354e85-311d-40bb-ae4a-5c535d4d89b9-operator-scripts\") pod \"cinder-db-create-w5qdk\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") " pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.874523 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q65k4\" (UniqueName: \"kubernetes.io/projected/eb354e85-311d-40bb-ae4a-5c535d4d89b9-kube-api-access-q65k4\") pod \"cinder-db-create-w5qdk\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") " pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.959972 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0d11-account-create-update-jf72z"] Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.962215 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.974491 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0d11-account-create-update-jf72z"] Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.977588 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q65k4\" (UniqueName: \"kubernetes.io/projected/eb354e85-311d-40bb-ae4a-5c535d4d89b9-kube-api-access-q65k4\") pod \"cinder-db-create-w5qdk\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") " pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.977875 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwcxc\" (UniqueName: \"kubernetes.io/projected/f9ee15e8-6695-454f-83ad-d54176458497-kube-api-access-mwcxc\") pod \"cinder-0d11-account-create-update-jf72z\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") " pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.978029 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ee15e8-6695-454f-83ad-d54176458497-operator-scripts\") pod \"cinder-0d11-account-create-update-jf72z\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") " pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.978149 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb354e85-311d-40bb-ae4a-5c535d4d89b9-operator-scripts\") pod \"cinder-db-create-w5qdk\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") " pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.980229 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb354e85-311d-40bb-ae4a-5c535d4d89b9-operator-scripts\") pod \"cinder-db-create-w5qdk\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") " pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:08 crc kubenswrapper[4836]: I0217 14:27:08.984035 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.080493 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwcxc\" (UniqueName: \"kubernetes.io/projected/f9ee15e8-6695-454f-83ad-d54176458497-kube-api-access-mwcxc\") pod \"cinder-0d11-account-create-update-jf72z\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") " pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.080596 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ee15e8-6695-454f-83ad-d54176458497-operator-scripts\") pod \"cinder-0d11-account-create-update-jf72z\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") " pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.081773 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ee15e8-6695-454f-83ad-d54176458497-operator-scripts\") pod \"cinder-0d11-account-create-update-jf72z\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") " pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.264337 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwcxc\" (UniqueName: \"kubernetes.io/projected/f9ee15e8-6695-454f-83ad-d54176458497-kube-api-access-mwcxc\") pod 
\"cinder-0d11-account-create-update-jf72z\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") " pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.264402 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q65k4\" (UniqueName: \"kubernetes.io/projected/eb354e85-311d-40bb-ae4a-5c535d4d89b9-kube-api-access-q65k4\") pod \"cinder-db-create-w5qdk\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") " pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.292581 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.406211 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.457040 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-2ea0-account-create-update-p7p99"] Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.459322 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.464273 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.479009 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-jjrp2"] Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.481025 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.493612 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-operator-scripts\") pod \"cloudkitty-db-create-jjrp2\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") " pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.494154 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee1a0f2-86df-4f97-957a-22bbd7da4505-operator-scripts\") pod \"cloudkitty-2ea0-account-create-update-p7p99\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") " pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.494482 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfw7\" (UniqueName: \"kubernetes.io/projected/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-kube-api-access-6zfw7\") pod \"cloudkitty-db-create-jjrp2\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") " pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.504262 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7d6c\" (UniqueName: \"kubernetes.io/projected/2ee1a0f2-86df-4f97-957a-22bbd7da4505-kube-api-access-j7d6c\") pod \"cloudkitty-2ea0-account-create-update-p7p99\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") " pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.498934 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-jjrp2"] Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 
14:27:09.610397 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-69hk6"] Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.610483 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfw7\" (UniqueName: \"kubernetes.io/projected/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-kube-api-access-6zfw7\") pod \"cloudkitty-db-create-jjrp2\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") " pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.611158 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7d6c\" (UniqueName: \"kubernetes.io/projected/2ee1a0f2-86df-4f97-957a-22bbd7da4505-kube-api-access-j7d6c\") pod \"cloudkitty-2ea0-account-create-update-p7p99\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") " pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.611491 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-operator-scripts\") pod \"cloudkitty-db-create-jjrp2\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") " pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.611534 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee1a0f2-86df-4f97-957a-22bbd7da4505-operator-scripts\") pod \"cloudkitty-2ea0-account-create-update-p7p99\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") " pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.613254 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2ee1a0f2-86df-4f97-957a-22bbd7da4505-operator-scripts\") pod \"cloudkitty-2ea0-account-create-update-p7p99\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") " pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.613247 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-operator-scripts\") pod \"cloudkitty-db-create-jjrp2\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") " pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:09 crc kubenswrapper[4836]: I0217 14:27:09.613498 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.042973 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-operator-scripts\") pod \"barbican-db-create-69hk6\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") " pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.045119 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm6jm\" (UniqueName: \"kubernetes.io/projected/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-kube-api-access-zm6jm\") pod \"barbican-db-create-69hk6\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") " pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.051142 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-2ea0-account-create-update-p7p99"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.071330 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-q25rr"] Feb 17 14:27:10 crc kubenswrapper[4836]: 
I0217 14:27:10.073153 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.088095 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-69hk6"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.103079 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q25rr"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.103195 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-652c-account-create-update-lswdv"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.105023 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.105324 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-s87v5" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.105492 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.107776 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.108965 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.111529 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7d6c\" (UniqueName: \"kubernetes.io/projected/2ee1a0f2-86df-4f97-957a-22bbd7da4505-kube-api-access-j7d6c\") pod \"cloudkitty-2ea0-account-create-update-p7p99\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") " pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.123066 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.159923 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-combined-ca-bundle\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.160030 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-operator-scripts\") pod \"barbican-db-create-69hk6\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") " pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.160095 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqspw\" (UniqueName: \"kubernetes.io/projected/767841a7-db94-430a-b408-10e5bd0350e5-kube-api-access-bqspw\") pod \"barbican-652c-account-create-update-lswdv\" (UID: 
\"767841a7-db94-430a-b408-10e5bd0350e5\") " pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.160171 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njnhf\" (UniqueName: \"kubernetes.io/projected/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-kube-api-access-njnhf\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.160220 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm6jm\" (UniqueName: \"kubernetes.io/projected/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-kube-api-access-zm6jm\") pod \"barbican-db-create-69hk6\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") " pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.160265 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/767841a7-db94-430a-b408-10e5bd0350e5-operator-scripts\") pod \"barbican-652c-account-create-update-lswdv\" (UID: \"767841a7-db94-430a-b408-10e5bd0350e5\") " pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.160321 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-config-data\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.161408 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-operator-scripts\") pod 
\"barbican-db-create-69hk6\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") " pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.164594 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-652c-account-create-update-lswdv"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.197552 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfw7\" (UniqueName: \"kubernetes.io/projected/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-kube-api-access-6zfw7\") pod \"cloudkitty-db-create-jjrp2\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") " pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.209337 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm6jm\" (UniqueName: \"kubernetes.io/projected/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-kube-api-access-zm6jm\") pod \"barbican-db-create-69hk6\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") " pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.234453 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nwjd8"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.237196 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.253601 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nwjd8"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.264669 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-operator-scripts\") pod \"neutron-db-create-nwjd8\" (UID: \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") " pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.264951 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/767841a7-db94-430a-b408-10e5bd0350e5-operator-scripts\") pod \"barbican-652c-account-create-update-lswdv\" (UID: \"767841a7-db94-430a-b408-10e5bd0350e5\") " pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.265092 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-config-data\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.265233 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-combined-ca-bundle\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.265448 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2xc5\" (UniqueName: 
\"kubernetes.io/projected/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-kube-api-access-h2xc5\") pod \"neutron-db-create-nwjd8\" (UID: \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") " pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.265588 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqspw\" (UniqueName: \"kubernetes.io/projected/767841a7-db94-430a-b408-10e5bd0350e5-kube-api-access-bqspw\") pod \"barbican-652c-account-create-update-lswdv\" (UID: \"767841a7-db94-430a-b408-10e5bd0350e5\") " pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.265829 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njnhf\" (UniqueName: \"kubernetes.io/projected/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-kube-api-access-njnhf\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.266337 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/767841a7-db94-430a-b408-10e5bd0350e5-operator-scripts\") pod \"barbican-652c-account-create-update-lswdv\" (UID: \"767841a7-db94-430a-b408-10e5bd0350e5\") " pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.270759 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-combined-ca-bundle\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.289882 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-14cb-account-create-update-xw2dd"] Feb 17 
14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.292193 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.300759 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-config-data\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.301141 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.304570 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njnhf\" (UniqueName: \"kubernetes.io/projected/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-kube-api-access-njnhf\") pod \"keystone-db-sync-q25rr\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.308205 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqspw\" (UniqueName: \"kubernetes.io/projected/767841a7-db94-430a-b408-10e5bd0350e5-kube-api-access-bqspw\") pod \"barbican-652c-account-create-update-lswdv\" (UID: \"767841a7-db94-430a-b408-10e5bd0350e5\") " pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.322533 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-14cb-account-create-update-xw2dd"] Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.368068 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-operator-scripts\") pod \"neutron-db-create-nwjd8\" (UID: 
\"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") " pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.368167 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxpsh\" (UniqueName: \"kubernetes.io/projected/623225aa-2492-494e-be5b-92acef6f23cf-kube-api-access-sxpsh\") pod \"neutron-14cb-account-create-update-xw2dd\" (UID: \"623225aa-2492-494e-be5b-92acef6f23cf\") " pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.368264 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2xc5\" (UniqueName: \"kubernetes.io/projected/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-kube-api-access-h2xc5\") pod \"neutron-db-create-nwjd8\" (UID: \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") " pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.368328 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623225aa-2492-494e-be5b-92acef6f23cf-operator-scripts\") pod \"neutron-14cb-account-create-update-xw2dd\" (UID: \"623225aa-2492-494e-be5b-92acef6f23cf\") " pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.369402 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-operator-scripts\") pod \"neutron-db-create-nwjd8\" (UID: \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") " pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.373676 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.387923 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.393581 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2xc5\" (UniqueName: \"kubernetes.io/projected/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-kube-api-access-h2xc5\") pod \"neutron-db-create-nwjd8\" (UID: \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") " pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.439367 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.470732 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623225aa-2492-494e-be5b-92acef6f23cf-operator-scripts\") pod \"neutron-14cb-account-create-update-xw2dd\" (UID: \"623225aa-2492-494e-be5b-92acef6f23cf\") " pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.470904 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxpsh\" (UniqueName: \"kubernetes.io/projected/623225aa-2492-494e-be5b-92acef6f23cf-kube-api-access-sxpsh\") pod \"neutron-14cb-account-create-update-xw2dd\" (UID: \"623225aa-2492-494e-be5b-92acef6f23cf\") " pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.471979 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623225aa-2492-494e-be5b-92acef6f23cf-operator-scripts\") pod \"neutron-14cb-account-create-update-xw2dd\" (UID: 
\"623225aa-2492-494e-be5b-92acef6f23cf\") " pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.476143 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": dial tcp 10.217.0.114:9090: connect: connection refused" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.492382 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxpsh\" (UniqueName: \"kubernetes.io/projected/623225aa-2492-494e-be5b-92acef6f23cf-kube-api-access-sxpsh\") pod \"neutron-14cb-account-create-update-xw2dd\" (UID: \"623225aa-2492-494e-be5b-92acef6f23cf\") " pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.494608 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.551800 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.601634 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nwjd8" Feb 17 14:27:10 crc kubenswrapper[4836]: I0217 14:27:10.703237 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-14cb-account-create-update-xw2dd" Feb 17 14:27:15 crc kubenswrapper[4836]: I0217 14:27:15.472655 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.114:9090/-/ready\": dial tcp 10.217.0.114:9090: connect: connection refused" Feb 17 14:27:15 crc kubenswrapper[4836]: I0217 14:27:15.473295 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:16 crc kubenswrapper[4836]: E0217 14:27:16.760531 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Feb 17 14:27:16 crc kubenswrapper[4836]: E0217 14:27:16.760854 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grffb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-z8g7x_openstack(df3a6cf1-bca0-45b2-9f7c-6d483452d49d): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Feb 17 14:27:16 crc kubenswrapper[4836]: E0217 14:27:16.762059 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-z8g7x" podUID="df3a6cf1-bca0-45b2-9f7c-6d483452d49d" Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.835362 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.963131 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz7ht\" (UniqueName: \"kubernetes.io/projected/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-kube-api-access-xz7ht\") pod \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.963374 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-log-ovn\") pod \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.963442 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run-ovn\") pod \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.963585 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run\") pod \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\" (UID: 
\"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.963676 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-scripts\") pod \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.963730 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-additional-scripts\") pod \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\" (UID: \"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec\") " Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.964522 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" (UID: "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.964582 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run" (OuterVolumeSpecName: "var-run") pod "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" (UID: "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.964796 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" (UID: "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.965872 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-scripts" (OuterVolumeSpecName: "scripts") pod "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" (UID: "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.966815 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" (UID: "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:16 crc kubenswrapper[4836]: I0217 14:27:16.970624 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-kube-api-access-xz7ht" (OuterVolumeSpecName: "kube-api-access-xz7ht") pod "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" (UID: "fb9f3dbb-ea37-4057-97c1-b93cbb39aaec"). InnerVolumeSpecName "kube-api-access-xz7ht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.171823 4836 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.171879 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.172862 4836 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.172886 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz7ht\" (UniqueName: \"kubernetes.io/projected/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-kube-api-access-xz7ht\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.172897 4836 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.172909 4836 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.375464 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ghk5k-config-tv4tb" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.375604 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ghk5k-config-tv4tb" event={"ID":"fb9f3dbb-ea37-4057-97c1-b93cbb39aaec","Type":"ContainerDied","Data":"c2bde1d5ba9adc61b0950993bd532e188bd3fc6a817496d9051110a296f9c5b2"} Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.376370 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bde1d5ba9adc61b0950993bd532e188bd3fc6a817496d9051110a296f9c5b2" Feb 17 14:27:17 crc kubenswrapper[4836]: E0217 14:27:17.376483 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-z8g7x" podUID="df3a6cf1-bca0-45b2-9f7c-6d483452d49d" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.377101 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424441 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-thanos-prometheus-http-client-file\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424639 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424698 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-tls-assets\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424728 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8z8l\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-kube-api-access-t8z8l\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424768 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-web-config\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424796 4836 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-2\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424821 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config-out\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424859 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424910 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-0\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.424964 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-1\") pod \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\" (UID: \"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0\") " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.426041 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.426082 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.427600 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.435244 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config-out" (OuterVolumeSpecName: "config-out") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.435339 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.435455 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.445619 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config" (OuterVolumeSpecName: "config") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.451399 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-kube-api-access-t8z8l" (OuterVolumeSpecName: "kube-api-access-t8z8l") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "kube-api-access-t8z8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.487989 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "pvc-93f26e02-6577-44e5-880e-5ede6b185735". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.498230 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-web-config" (OuterVolumeSpecName: "web-config") pod "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" (UID: "e29295a6-4250-4a7d-84c5-a8b3ddf96bd0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.526881 4836 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527471 4836 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") on node \"crc\" " Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527500 4836 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527517 4836 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-t8z8l\" (UniqueName: \"kubernetes.io/projected/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-kube-api-access-t8z8l\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527533 4836 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-web-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527545 4836 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527557 4836 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config-out\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527573 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527585 4836 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.527597 4836 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.553681 4836 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.553981 4836 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-93f26e02-6577-44e5-880e-5ede6b185735" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735") on node "crc" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.632553 4836 reconciler_common.go:293] "Volume detached for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:17 crc kubenswrapper[4836]: E0217 14:27:17.815743 4836 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9f3dbb_ea37_4057_97c1_b93cbb39aaec.slice/crio-c2bde1d5ba9adc61b0950993bd532e188bd3fc6a817496d9051110a296f9c5b2\": RecentStats: unable to find data in memory cache]" Feb 17 14:27:17 crc kubenswrapper[4836]: I0217 14:27:17.838687 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.004775 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ghk5k-config-tv4tb"] Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.341513 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ghk5k-config-tv4tb"] Feb 17 14:27:18 crc kubenswrapper[4836]: W0217 14:27:18.414474 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ce1c7a_57e8_491e_84ab_8aed8baea37b.slice/crio-c2e28b74f07f40dbbfa11b857fc086ce56671a9ec1d5525b52424bc04b43fc61 WatchSource:0}: Error finding container 
c2e28b74f07f40dbbfa11b857fc086ce56671a9ec1d5525b52424bc04b43fc61: Status 404 returned error can't find the container with id c2e28b74f07f40dbbfa11b857fc086ce56671a9ec1d5525b52424bc04b43fc61 Feb 17 14:27:18 crc kubenswrapper[4836]: W0217 14:27:18.418863 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod623225aa_2492_494e_be5b_92acef6f23cf.slice/crio-f1efd21934a9bbf27db4f3919668653a876c21cef3f9f7e29e0c0ecf147efc4b WatchSource:0}: Error finding container f1efd21934a9bbf27db4f3919668653a876c21cef3f9f7e29e0c0ecf147efc4b: Status 404 returned error can't find the container with id f1efd21934a9bbf27db4f3919668653a876c21cef3f9f7e29e0c0ecf147efc4b Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.423056 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-14cb-account-create-update-xw2dd"] Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.425278 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e29295a6-4250-4a7d-84c5-a8b3ddf96bd0","Type":"ContainerDied","Data":"cefd70541e5e6c57648aaec13bc3ac8008ad32d2cca2fd2d95d8a18012223fb3"} Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.425380 4836 scope.go:117] "RemoveContainer" containerID="839af704fe28aeff5f1ab20ca6e7c7a0fb25790fc5bc232fe9131c132f8e0bf9" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.425600 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.439744 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.445181 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nwjd8"] Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.549469 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.607580 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" path="/var/lib/kubelet/pods/fb9f3dbb-ea37-4057-97c1-b93cbb39aaec/volumes" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.610089 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.610149 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:27:18 crc kubenswrapper[4836]: E0217 14:27:18.612479 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="config-reloader" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.612516 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="config-reloader" Feb 17 14:27:18 crc kubenswrapper[4836]: E0217 14:27:18.612538 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="prometheus" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.612546 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="prometheus" Feb 17 14:27:18 crc kubenswrapper[4836]: E0217 14:27:18.612574 4836 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" containerName="ovn-config" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.612583 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" containerName="ovn-config" Feb 17 14:27:18 crc kubenswrapper[4836]: E0217 14:27:18.612759 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="init-config-reloader" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.612777 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="init-config-reloader" Feb 17 14:27:18 crc kubenswrapper[4836]: E0217 14:27:18.612792 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="thanos-sidecar" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.612801 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="thanos-sidecar" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.613171 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="config-reloader" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.613198 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="thanos-sidecar" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.613222 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" containerName="prometheus" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.613238 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9f3dbb-ea37-4057-97c1-b93cbb39aaec" containerName="ovn-config" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.613354 4836 scope.go:117] 
"RemoveContainer" containerID="2634435ab0e106f5ce9041eacdd8794376187c382228fa8d9f52a71bd9ec4553" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.616326 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.627914 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.627922 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.628512 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.628559 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.628804 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7d2x" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.629043 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.629185 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.630651 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.633152 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 
14:27:18.634043 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.643654 4836 scope.go:117] "RemoveContainer" containerID="5de26698cc194f27aa6fa46281e03b3fa0bc2faa6bf0ef9b745f3fec33e05835" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.734705 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fec8667-7189-4e29-8362-37dd935d2db7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.736490 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fec8667-7189-4e29-8362-37dd935d2db7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.736611 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.736664 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc 
kubenswrapper[4836]: I0217 14:27:18.736731 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvdd\" (UniqueName: \"kubernetes.io/projected/6fec8667-7189-4e29-8362-37dd935d2db7-kube-api-access-lmvdd\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.736758 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.736969 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.737032 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.737058 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.737154 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-config\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.737718 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.737793 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.737834 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " 
pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.754452 4836 scope.go:117] "RemoveContainer" containerID="1aeb38549c5093ddcbd19fe025e8df306afcc08ba355a33bcd16537686f0d989" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.839777 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.839846 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.839950 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fec8667-7189-4e29-8362-37dd935d2db7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840002 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fec8667-7189-4e29-8362-37dd935d2db7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840043 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840077 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840118 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvdd\" (UniqueName: \"kubernetes.io/projected/6fec8667-7189-4e29-8362-37dd935d2db7-kube-api-access-lmvdd\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840146 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840207 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840235 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840277 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840341 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-config\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.840360 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0" Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.845352 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-2\") pod 
\"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.845752 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.847029 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6fec8667-7189-4e29-8362-37dd935d2db7-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.849030 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.850983 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.853431 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.855030 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fec8667-7189-4e29-8362-37dd935d2db7-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.856133 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.857379 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.857492 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/94da064c7e93eda9403c837c8900dc0ec43041d0305170815d7b87148c388206/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.859520 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fec8667-7189-4e29-8362-37dd935d2db7-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.864248 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.872865 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6fec8667-7189-4e29-8362-37dd935d2db7-config\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.889511 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvdd\" (UniqueName: \"kubernetes.io/projected/6fec8667-7189-4e29-8362-37dd935d2db7-kube-api-access-lmvdd\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.922467 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 17 14:27:18 crc kubenswrapper[4836]: W0217 14:27:18.932076 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ee15e8_6695_454f_83ad_d54176458497.slice/crio-f0bdbf6a269741552be971089824b352c183b18e469724857214ea40b421f6ba WatchSource:0}: Error finding container f0bdbf6a269741552be971089824b352c183b18e469724857214ea40b421f6ba: Status 404 returned error can't find the container with id f0bdbf6a269741552be971089824b352c183b18e469724857214ea40b421f6ba
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.932433 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-2ea0-account-create-update-p7p99"]
Feb 17 14:27:18 crc kubenswrapper[4836]: I0217 14:27:18.978397 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-93f26e02-6577-44e5-880e-5ede6b185735\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-93f26e02-6577-44e5-880e-5ede6b185735\") pod \"prometheus-metric-storage-0\" (UID: \"6fec8667-7189-4e29-8362-37dd935d2db7\") " pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:18 crc kubenswrapper[4836]: W0217 14:27:18.997092 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1fe36f3_d6b6_44e0_b85b_6def754fd08e.slice/crio-17df418fd624b8478184e0378c99426f8f65d89a01b5f290b6361a3a1e8ae4b1 WatchSource:0}: Error finding container 17df418fd624b8478184e0378c99426f8f65d89a01b5f290b6361a3a1e8ae4b1: Status 404 returned error can't find the container with id 17df418fd624b8478184e0378c99426f8f65d89a01b5f290b6361a3a1e8ae4b1
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.013132 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.013397 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.017608 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-652c-account-create-update-lswdv"]
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.030801 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 17 14:27:19 crc kubenswrapper[4836]: W0217 14:27:19.031963 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1d4ef8_03d9_42d8_ae0b_9410767ed25f.slice/crio-d0ca56828b8b775526488a0852f84f24a4af21e12c68a30c67a55c19e97b65de WatchSource:0}: Error finding container d0ca56828b8b775526488a0852f84f24a4af21e12c68a30c67a55c19e97b65de: Status 404 returned error can't find the container with id d0ca56828b8b775526488a0852f84f24a4af21e12c68a30c67a55c19e97b65de
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.069756 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0d11-account-create-update-jf72z"]
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.110639 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-jjrp2"]
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.124838 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-69hk6"]
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.144163 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-q25rr"]
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.154365 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-w5qdk"]
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.212078 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 17 14:27:19 crc kubenswrapper[4836]: W0217 14:27:19.317180 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode482046c_502a_4f41_b013_7b3ef1c71ee1.slice/crio-bdd974c7e983ba9188d62382b19b6a16428ca529e476c2ae048d286d85f2cce3 WatchSource:0}: Error finding container bdd974c7e983ba9188d62382b19b6a16428ca529e476c2ae048d286d85f2cce3: Status 404 returned error can't find the container with id bdd974c7e983ba9188d62382b19b6a16428ca529e476c2ae048d286d85f2cce3
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.651625 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" event={"ID":"2ee1a0f2-86df-4f97-957a-22bbd7da4505","Type":"ContainerStarted","Data":"fbea486a2a4fe13c5e5757c175c46cee2b4c46bf0588a5aa9f5c9b50a17e6502"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.663747 4836 generic.go:334] "Generic (PLEG): container finished" podID="623225aa-2492-494e-be5b-92acef6f23cf" containerID="b3fd8198bda32089f8d16c7005023bc9355442a69582a28217b4faa19a58edfd" exitCode=0
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.664024 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-14cb-account-create-update-xw2dd" event={"ID":"623225aa-2492-494e-be5b-92acef6f23cf","Type":"ContainerDied","Data":"b3fd8198bda32089f8d16c7005023bc9355442a69582a28217b4faa19a58edfd"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.664115 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-14cb-account-create-update-xw2dd" event={"ID":"623225aa-2492-494e-be5b-92acef6f23cf","Type":"ContainerStarted","Data":"f1efd21934a9bbf27db4f3919668653a876c21cef3f9f7e29e0c0ecf147efc4b"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.689776 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"bdd974c7e983ba9188d62382b19b6a16428ca529e476c2ae048d286d85f2cce3"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.702039 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-jjrp2" event={"ID":"a1fe36f3-d6b6-44e0-b85b-6def754fd08e","Type":"ContainerStarted","Data":"17df418fd624b8478184e0378c99426f8f65d89a01b5f290b6361a3a1e8ae4b1"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.707762 4836 generic.go:334] "Generic (PLEG): container finished" podID="d4ce1c7a-57e8-491e-84ab-8aed8baea37b" containerID="7e6f04d96e5a077df5020259f367870723b0f91e790c0b81e936bf2cbc3790f9" exitCode=0
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.707864 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nwjd8" event={"ID":"d4ce1c7a-57e8-491e-84ab-8aed8baea37b","Type":"ContainerDied","Data":"7e6f04d96e5a077df5020259f367870723b0f91e790c0b81e936bf2cbc3790f9"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.707906 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nwjd8" event={"ID":"d4ce1c7a-57e8-491e-84ab-8aed8baea37b","Type":"ContainerStarted","Data":"c2e28b74f07f40dbbfa11b857fc086ce56671a9ec1d5525b52424bc04b43fc61"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.739657 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q25rr" event={"ID":"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f","Type":"ContainerStarted","Data":"d0ca56828b8b775526488a0852f84f24a4af21e12c68a30c67a55c19e97b65de"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.759763 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0d11-account-create-update-jf72z" event={"ID":"f9ee15e8-6695-454f-83ad-d54176458497","Type":"ContainerStarted","Data":"f0bdbf6a269741552be971089824b352c183b18e469724857214ea40b421f6ba"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.785160 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w5qdk" event={"ID":"eb354e85-311d-40bb-ae4a-5c535d4d89b9","Type":"ContainerStarted","Data":"e496c31a00d09e5ed74cd58d9920320d6ca8639e2eda3e165b272a2eff9d6bd6"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.786956 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-69hk6" event={"ID":"4edeb89f-0bd9-466e-a9f9-2d45575d2c72","Type":"ContainerStarted","Data":"d3d6bb45c56fb523eb76b17cc800f28e1531da5278e29bfe6d07de89f3199e47"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.804148 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-652c-account-create-update-lswdv" event={"ID":"767841a7-db94-430a-b408-10e5bd0350e5","Type":"ContainerStarted","Data":"7e8ac5cbf4b170d941ee6315c12ea589e4cbab28df2a33a941f9c1feb21af48e"}
Feb 17 14:27:19 crc kubenswrapper[4836]: I0217 14:27:19.816241 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.589686 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29295a6-4250-4a7d-84c5-a8b3ddf96bd0" path="/var/lib/kubelet/pods/e29295a6-4250-4a7d-84c5-a8b3ddf96bd0/volumes"
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.836490 4836 generic.go:334] "Generic (PLEG): container finished" podID="4edeb89f-0bd9-466e-a9f9-2d45575d2c72" containerID="3ae7c112e0518db5ada6508ad8c57217e914b3d3401ff927d4aa18b2e2dd9f79" exitCode=0
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.837811 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-69hk6" event={"ID":"4edeb89f-0bd9-466e-a9f9-2d45575d2c72","Type":"ContainerDied","Data":"3ae7c112e0518db5ada6508ad8c57217e914b3d3401ff927d4aa18b2e2dd9f79"}
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.843545 4836 generic.go:334] "Generic (PLEG): container finished" podID="a1fe36f3-d6b6-44e0-b85b-6def754fd08e" containerID="86d009aabc2aafe94768037f28b03b96d85141a639669b82cdbd2fa653d9696d" exitCode=0
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.843677 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-jjrp2" event={"ID":"a1fe36f3-d6b6-44e0-b85b-6def754fd08e","Type":"ContainerDied","Data":"86d009aabc2aafe94768037f28b03b96d85141a639669b82cdbd2fa653d9696d"}
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.846832 4836 generic.go:334] "Generic (PLEG): container finished" podID="767841a7-db94-430a-b408-10e5bd0350e5" containerID="5e36e16a50074efc0038c12585afeefa45bc968423f053fecc01a7a460fc9fd3" exitCode=0
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.847039 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-652c-account-create-update-lswdv" event={"ID":"767841a7-db94-430a-b408-10e5bd0350e5","Type":"ContainerDied","Data":"5e36e16a50074efc0038c12585afeefa45bc968423f053fecc01a7a460fc9fd3"}
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.850020 4836 generic.go:334] "Generic (PLEG): container finished" podID="2ee1a0f2-86df-4f97-957a-22bbd7da4505" containerID="e3b5cb6d26fdb2e586683ff31b8abe63df8d533a376c42dd280747ab5e165f5e" exitCode=0
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.850115 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" event={"ID":"2ee1a0f2-86df-4f97-957a-22bbd7da4505","Type":"ContainerDied","Data":"e3b5cb6d26fdb2e586683ff31b8abe63df8d533a376c42dd280747ab5e165f5e"}
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.852414 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6fec8667-7189-4e29-8362-37dd935d2db7","Type":"ContainerStarted","Data":"bb66f128cce682cdff9affc9b52d41d1a5e4fb7196fde6c011efbd2fe8f4b847"}
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.863257 4836 generic.go:334] "Generic (PLEG): container finished" podID="f9ee15e8-6695-454f-83ad-d54176458497" containerID="7f08e0024064e8fd1c473afb57d745eb10366b72696b8824621db71657c54472" exitCode=0
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.863557 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0d11-account-create-update-jf72z" event={"ID":"f9ee15e8-6695-454f-83ad-d54176458497","Type":"ContainerDied","Data":"7f08e0024064e8fd1c473afb57d745eb10366b72696b8824621db71657c54472"}
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.877527 4836 generic.go:334] "Generic (PLEG): container finished" podID="eb354e85-311d-40bb-ae4a-5c535d4d89b9" containerID="0112cdba6fc4f4acf8102f48cb77deaeb49a0b5c8b49e3c6adcdb559d7e100b6" exitCode=0
Feb 17 14:27:20 crc kubenswrapper[4836]: I0217 14:27:20.877843 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w5qdk" event={"ID":"eb354e85-311d-40bb-ae4a-5c535d4d89b9","Type":"ContainerDied","Data":"0112cdba6fc4f4acf8102f48cb77deaeb49a0b5c8b49e3c6adcdb559d7e100b6"}
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.521362 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-14cb-account-create-update-xw2dd"
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.534215 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nwjd8"
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.624729 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxpsh\" (UniqueName: \"kubernetes.io/projected/623225aa-2492-494e-be5b-92acef6f23cf-kube-api-access-sxpsh\") pod \"623225aa-2492-494e-be5b-92acef6f23cf\" (UID: \"623225aa-2492-494e-be5b-92acef6f23cf\") "
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.625013 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623225aa-2492-494e-be5b-92acef6f23cf-operator-scripts\") pod \"623225aa-2492-494e-be5b-92acef6f23cf\" (UID: \"623225aa-2492-494e-be5b-92acef6f23cf\") "
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.625650 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/623225aa-2492-494e-be5b-92acef6f23cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "623225aa-2492-494e-be5b-92acef6f23cf" (UID: "623225aa-2492-494e-be5b-92acef6f23cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.626046 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623225aa-2492-494e-be5b-92acef6f23cf-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.630684 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623225aa-2492-494e-be5b-92acef6f23cf-kube-api-access-sxpsh" (OuterVolumeSpecName: "kube-api-access-sxpsh") pod "623225aa-2492-494e-be5b-92acef6f23cf" (UID: "623225aa-2492-494e-be5b-92acef6f23cf"). InnerVolumeSpecName "kube-api-access-sxpsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.727535 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-operator-scripts\") pod \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\" (UID: \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") "
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.727680 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2xc5\" (UniqueName: \"kubernetes.io/projected/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-kube-api-access-h2xc5\") pod \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\" (UID: \"d4ce1c7a-57e8-491e-84ab-8aed8baea37b\") "
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.728150 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxpsh\" (UniqueName: \"kubernetes.io/projected/623225aa-2492-494e-be5b-92acef6f23cf-kube-api-access-sxpsh\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.731512 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4ce1c7a-57e8-491e-84ab-8aed8baea37b" (UID: "d4ce1c7a-57e8-491e-84ab-8aed8baea37b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.734078 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-kube-api-access-h2xc5" (OuterVolumeSpecName: "kube-api-access-h2xc5") pod "d4ce1c7a-57e8-491e-84ab-8aed8baea37b" (UID: "d4ce1c7a-57e8-491e-84ab-8aed8baea37b"). InnerVolumeSpecName "kube-api-access-h2xc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.830995 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.831055 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2xc5\" (UniqueName: \"kubernetes.io/projected/d4ce1c7a-57e8-491e-84ab-8aed8baea37b-kube-api-access-h2xc5\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.893896 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nwjd8" event={"ID":"d4ce1c7a-57e8-491e-84ab-8aed8baea37b","Type":"ContainerDied","Data":"c2e28b74f07f40dbbfa11b857fc086ce56671a9ec1d5525b52424bc04b43fc61"}
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.893952 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e28b74f07f40dbbfa11b857fc086ce56671a9ec1d5525b52424bc04b43fc61"
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.894025 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nwjd8"
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.895558 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-14cb-account-create-update-xw2dd" event={"ID":"623225aa-2492-494e-be5b-92acef6f23cf","Type":"ContainerDied","Data":"f1efd21934a9bbf27db4f3919668653a876c21cef3f9f7e29e0c0ecf147efc4b"}
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.895583 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1efd21934a9bbf27db4f3919668653a876c21cef3f9f7e29e0c0ecf147efc4b"
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.895659 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-14cb-account-create-update-xw2dd"
Feb 17 14:27:21 crc kubenswrapper[4836]: I0217 14:27:21.904325 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"a711cf2d892c351da83705eaa0cc64eb3b3425e6beec5a7dae46e099c405eacd"}
Feb 17 14:27:23 crc kubenswrapper[4836]: I0217 14:27:23.929724 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"6589c0845f92c6a5bdd2d7f0de1decf41fd63691cfcde22131a6bea30b14f06a"}
Feb 17 14:27:24 crc kubenswrapper[4836]: I0217 14:27:24.955495 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6fec8667-7189-4e29-8362-37dd935d2db7","Type":"ContainerStarted","Data":"a82e37c7eb14ee548654e466a1de02d0ef7f18f1bf7fd37d772effc7cc961f91"}
Feb 17 14:27:27 crc kubenswrapper[4836]: I0217 14:27:27.992781 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-jjrp2" event={"ID":"a1fe36f3-d6b6-44e0-b85b-6def754fd08e","Type":"ContainerDied","Data":"17df418fd624b8478184e0378c99426f8f65d89a01b5f290b6361a3a1e8ae4b1"}
Feb 17 14:27:27 crc kubenswrapper[4836]: I0217 14:27:27.993791 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17df418fd624b8478184e0378c99426f8f65d89a01b5f290b6361a3a1e8ae4b1"
Feb 17 14:27:27 crc kubenswrapper[4836]: I0217 14:27:27.995714 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-652c-account-create-update-lswdv" event={"ID":"767841a7-db94-430a-b408-10e5bd0350e5","Type":"ContainerDied","Data":"7e8ac5cbf4b170d941ee6315c12ea589e4cbab28df2a33a941f9c1feb21af48e"}
Feb 17 14:27:27 crc kubenswrapper[4836]: I0217 14:27:27.995746 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e8ac5cbf4b170d941ee6315c12ea589e4cbab28df2a33a941f9c1feb21af48e"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.000006 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" event={"ID":"2ee1a0f2-86df-4f97-957a-22bbd7da4505","Type":"ContainerDied","Data":"fbea486a2a4fe13c5e5757c175c46cee2b4c46bf0588a5aa9f5c9b50a17e6502"}
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.000038 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbea486a2a4fe13c5e5757c175c46cee2b4c46bf0588a5aa9f5c9b50a17e6502"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.001478 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0d11-account-create-update-jf72z" event={"ID":"f9ee15e8-6695-454f-83ad-d54176458497","Type":"ContainerDied","Data":"f0bdbf6a269741552be971089824b352c183b18e469724857214ea40b421f6ba"}
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.001509 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0bdbf6a269741552be971089824b352c183b18e469724857214ea40b421f6ba"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.003332 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-w5qdk" event={"ID":"eb354e85-311d-40bb-ae4a-5c535d4d89b9","Type":"ContainerDied","Data":"e496c31a00d09e5ed74cd58d9920320d6ca8639e2eda3e165b272a2eff9d6bd6"}
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.003367 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e496c31a00d09e5ed74cd58d9920320d6ca8639e2eda3e165b272a2eff9d6bd6"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.005018 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-69hk6" event={"ID":"4edeb89f-0bd9-466e-a9f9-2d45575d2c72","Type":"ContainerDied","Data":"d3d6bb45c56fb523eb76b17cc800f28e1531da5278e29bfe6d07de89f3199e47"}
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.005046 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d6bb45c56fb523eb76b17cc800f28e1531da5278e29bfe6d07de89f3199e47"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.019584 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0d11-account-create-update-jf72z"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.051696 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-69hk6"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.066377 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w5qdk"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.070274 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm6jm\" (UniqueName: \"kubernetes.io/projected/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-kube-api-access-zm6jm\") pod \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.070358 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwcxc\" (UniqueName: \"kubernetes.io/projected/f9ee15e8-6695-454f-83ad-d54176458497-kube-api-access-mwcxc\") pod \"f9ee15e8-6695-454f-83ad-d54176458497\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.070453 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ee15e8-6695-454f-83ad-d54176458497-operator-scripts\") pod \"f9ee15e8-6695-454f-83ad-d54176458497\" (UID: \"f9ee15e8-6695-454f-83ad-d54176458497\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.070497 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-operator-scripts\") pod \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\" (UID: \"4edeb89f-0bd9-466e-a9f9-2d45575d2c72\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.070547 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q65k4\" (UniqueName: \"kubernetes.io/projected/eb354e85-311d-40bb-ae4a-5c535d4d89b9-kube-api-access-q65k4\") pod \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.070749 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb354e85-311d-40bb-ae4a-5c535d4d89b9-operator-scripts\") pod \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\" (UID: \"eb354e85-311d-40bb-ae4a-5c535d4d89b9\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.072153 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb354e85-311d-40bb-ae4a-5c535d4d89b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb354e85-311d-40bb-ae4a-5c535d4d89b9" (UID: "eb354e85-311d-40bb-ae4a-5c535d4d89b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.073205 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9ee15e8-6695-454f-83ad-d54176458497-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9ee15e8-6695-454f-83ad-d54176458497" (UID: "f9ee15e8-6695-454f-83ad-d54176458497"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.075945 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4edeb89f-0bd9-466e-a9f9-2d45575d2c72" (UID: "4edeb89f-0bd9-466e-a9f9-2d45575d2c72"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.079267 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ee15e8-6695-454f-83ad-d54176458497-kube-api-access-mwcxc" (OuterVolumeSpecName: "kube-api-access-mwcxc") pod "f9ee15e8-6695-454f-83ad-d54176458497" (UID: "f9ee15e8-6695-454f-83ad-d54176458497"). InnerVolumeSpecName "kube-api-access-mwcxc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.082120 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-kube-api-access-zm6jm" (OuterVolumeSpecName: "kube-api-access-zm6jm") pod "4edeb89f-0bd9-466e-a9f9-2d45575d2c72" (UID: "4edeb89f-0bd9-466e-a9f9-2d45575d2c72"). InnerVolumeSpecName "kube-api-access-zm6jm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.090633 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb354e85-311d-40bb-ae4a-5c535d4d89b9-kube-api-access-q65k4" (OuterVolumeSpecName: "kube-api-access-q65k4") pod "eb354e85-311d-40bb-ae4a-5c535d4d89b9" (UID: "eb354e85-311d-40bb-ae4a-5c535d4d89b9"). InnerVolumeSpecName "kube-api-access-q65k4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.172484 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm6jm\" (UniqueName: \"kubernetes.io/projected/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-kube-api-access-zm6jm\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.172533 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwcxc\" (UniqueName: \"kubernetes.io/projected/f9ee15e8-6695-454f-83ad-d54176458497-kube-api-access-mwcxc\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.172546 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9ee15e8-6695-454f-83ad-d54176458497-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.172559 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4edeb89f-0bd9-466e-a9f9-2d45575d2c72-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.172569 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q65k4\" (UniqueName: \"kubernetes.io/projected/eb354e85-311d-40bb-ae4a-5c535d4d89b9-kube-api-access-q65k4\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.172578 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb354e85-311d-40bb-ae4a-5c535d4d89b9-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.173176 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-2ea0-account-create-update-p7p99"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.182603 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-652c-account-create-update-lswdv"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.200585 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-jjrp2"
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.376040 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee1a0f2-86df-4f97-957a-22bbd7da4505-operator-scripts\") pod \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.376657 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7d6c\" (UniqueName: \"kubernetes.io/projected/2ee1a0f2-86df-4f97-957a-22bbd7da4505-kube-api-access-j7d6c\") pod \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\" (UID: \"2ee1a0f2-86df-4f97-957a-22bbd7da4505\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.376762 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqspw\" (UniqueName: \"kubernetes.io/projected/767841a7-db94-430a-b408-10e5bd0350e5-kube-api-access-bqspw\") pod \"767841a7-db94-430a-b408-10e5bd0350e5\" (UID: \"767841a7-db94-430a-b408-10e5bd0350e5\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.376809 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-operator-scripts\") pod \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") "
Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.376874 4836
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/767841a7-db94-430a-b408-10e5bd0350e5-operator-scripts\") pod \"767841a7-db94-430a-b408-10e5bd0350e5\" (UID: \"767841a7-db94-430a-b408-10e5bd0350e5\") " Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.376910 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfw7\" (UniqueName: \"kubernetes.io/projected/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-kube-api-access-6zfw7\") pod \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\" (UID: \"a1fe36f3-d6b6-44e0-b85b-6def754fd08e\") " Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.379374 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/767841a7-db94-430a-b408-10e5bd0350e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "767841a7-db94-430a-b408-10e5bd0350e5" (UID: "767841a7-db94-430a-b408-10e5bd0350e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.379783 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ee1a0f2-86df-4f97-957a-22bbd7da4505-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ee1a0f2-86df-4f97-957a-22bbd7da4505" (UID: "2ee1a0f2-86df-4f97-957a-22bbd7da4505"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.379893 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1fe36f3-d6b6-44e0-b85b-6def754fd08e" (UID: "a1fe36f3-d6b6-44e0-b85b-6def754fd08e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.384152 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767841a7-db94-430a-b408-10e5bd0350e5-kube-api-access-bqspw" (OuterVolumeSpecName: "kube-api-access-bqspw") pod "767841a7-db94-430a-b408-10e5bd0350e5" (UID: "767841a7-db94-430a-b408-10e5bd0350e5"). InnerVolumeSpecName "kube-api-access-bqspw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.384199 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee1a0f2-86df-4f97-957a-22bbd7da4505-kube-api-access-j7d6c" (OuterVolumeSpecName: "kube-api-access-j7d6c") pod "2ee1a0f2-86df-4f97-957a-22bbd7da4505" (UID: "2ee1a0f2-86df-4f97-957a-22bbd7da4505"). InnerVolumeSpecName "kube-api-access-j7d6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.384218 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-kube-api-access-6zfw7" (OuterVolumeSpecName: "kube-api-access-6zfw7") pod "a1fe36f3-d6b6-44e0-b85b-6def754fd08e" (UID: "a1fe36f3-d6b6-44e0-b85b-6def754fd08e"). InnerVolumeSpecName "kube-api-access-6zfw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.481675 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqspw\" (UniqueName: \"kubernetes.io/projected/767841a7-db94-430a-b408-10e5bd0350e5-kube-api-access-bqspw\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.481720 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.481731 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/767841a7-db94-430a-b408-10e5bd0350e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.481741 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zfw7\" (UniqueName: \"kubernetes.io/projected/a1fe36f3-d6b6-44e0-b85b-6def754fd08e-kube-api-access-6zfw7\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.481751 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ee1a0f2-86df-4f97-957a-22bbd7da4505-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:28 crc kubenswrapper[4836]: I0217 14:27:28.481760 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7d6c\" (UniqueName: \"kubernetes.io/projected/2ee1a0f2-86df-4f97-957a-22bbd7da4505-kube-api-access-j7d6c\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.029771 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q25rr" 
event={"ID":"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f","Type":"ContainerStarted","Data":"515b55d1439f54ad3649999fcf112b0e86238d037ec2170a1978295a22c02429"} Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.038105 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0d11-account-create-update-jf72z" Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.039434 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"10e8f1a8271fe8b9a3e079d4e3f9b1c9e1a94071cc3e1794381ca8b55232643b"} Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.040934 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"7f1925fc13f4afdcc4c45288037b0e54b58885bef6bfe0968e5743fdedd3eee5"} Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.040713 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-2ea0-account-create-update-p7p99" Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.040778 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-69hk6" Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.040624 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-jjrp2" Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.040837 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-w5qdk" Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.040831 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-652c-account-create-update-lswdv" Feb 17 14:27:29 crc kubenswrapper[4836]: I0217 14:27:29.071598 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-q25rr" podStartSLOduration=11.378752471 podStartE2EDuration="20.071535812s" podCreationTimestamp="2026-02-17 14:27:09 +0000 UTC" firstStartedPulling="2026-02-17 14:27:19.094410367 +0000 UTC m=+1265.437338636" lastFinishedPulling="2026-02-17 14:27:27.787193708 +0000 UTC m=+1274.130121977" observedRunningTime="2026-02-17 14:27:29.0544771 +0000 UTC m=+1275.397405369" watchObservedRunningTime="2026-02-17 14:27:29.071535812 +0000 UTC m=+1275.414464081" Feb 17 14:27:32 crc kubenswrapper[4836]: I0217 14:27:32.079628 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z8g7x" event={"ID":"df3a6cf1-bca0-45b2-9f7c-6d483452d49d","Type":"ContainerStarted","Data":"2953db160f228060c084b5fd479ec149c2b0acd6cacae4957fb68229d08ae1b9"} Feb 17 14:27:32 crc kubenswrapper[4836]: I0217 14:27:32.092357 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"22a3a4e55efe4e3d464889001a9fe901ffb4ea46c14cdd7e85c9d2c2e6e3edfd"} Feb 17 14:27:32 crc kubenswrapper[4836]: I0217 14:27:32.092438 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"5e6c4cd2951fb0b422fb40c5f945bb3f2e7b7e9db69228f8713f1bc45540baa5"} Feb 17 14:27:32 crc kubenswrapper[4836]: I0217 14:27:32.092459 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"4cb65cd5b122c3dcb9c9f25106640404b4eabc5c79e62548ea4c28fea1377b9a"} Feb 17 14:27:32 crc kubenswrapper[4836]: I0217 14:27:32.106143 4836 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-z8g7x" podStartSLOduration=2.930026148 podStartE2EDuration="41.106107481s" podCreationTimestamp="2026-02-17 14:26:51 +0000 UTC" firstStartedPulling="2026-02-17 14:26:52.865521887 +0000 UTC m=+1239.208450156" lastFinishedPulling="2026-02-17 14:27:31.04160322 +0000 UTC m=+1277.384531489" observedRunningTime="2026-02-17 14:27:32.100490569 +0000 UTC m=+1278.443418858" watchObservedRunningTime="2026-02-17 14:27:32.106107481 +0000 UTC m=+1278.449035760" Feb 17 14:27:33 crc kubenswrapper[4836]: I0217 14:27:33.110042 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"09f154f4fd28f63dbec7ac1035524a84210785f03e39a3ac0cfc39a54b0f40e4"} Feb 17 14:27:34 crc kubenswrapper[4836]: I0217 14:27:34.228709 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"1d67e755bebbdcbc6c4235f60792c46f335a26119c8d75fdd344cb8cbda7ab2e"} Feb 17 14:27:34 crc kubenswrapper[4836]: I0217 14:27:34.229926 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"59ef4eb444efd2a3547796e2e5dbbda7d2ba921912ba9e5ccad7ea3bd4ca8b8c"} Feb 17 14:27:34 crc kubenswrapper[4836]: I0217 14:27:34.231780 4836 generic.go:334] "Generic (PLEG): container finished" podID="6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" containerID="515b55d1439f54ad3649999fcf112b0e86238d037ec2170a1978295a22c02429" exitCode=0 Feb 17 14:27:34 crc kubenswrapper[4836]: I0217 14:27:34.231828 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q25rr" 
event={"ID":"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f","Type":"ContainerDied","Data":"515b55d1439f54ad3649999fcf112b0e86238d037ec2170a1978295a22c02429"} Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.255871 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"12fae51aea87a0815d125bc2d63bb27751d34a74693a02e2653210bd2a718db7"} Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.256467 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"025091d0945dd7416700daf100e14ba799e1c45450a4fd76dca0971c8617473d"} Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.256500 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"ef3b36336f0744c33a0143bf2d440f444db1cbbcfe714850f5511d030edb053f"} Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.746746 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.863989 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-config-data\") pod \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.864150 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-combined-ca-bundle\") pod \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.864773 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njnhf\" (UniqueName: \"kubernetes.io/projected/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-kube-api-access-njnhf\") pod \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\" (UID: \"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f\") " Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.874683 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-kube-api-access-njnhf" (OuterVolumeSpecName: "kube-api-access-njnhf") pod "6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" (UID: "6a1d4ef8-03d9-42d8-ae0b-9410767ed25f"). InnerVolumeSpecName "kube-api-access-njnhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.895674 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" (UID: "6a1d4ef8-03d9-42d8-ae0b-9410767ed25f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.930939 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-config-data" (OuterVolumeSpecName: "config-data") pod "6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" (UID: "6a1d4ef8-03d9-42d8-ae0b-9410767ed25f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.967866 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.968391 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njnhf\" (UniqueName: \"kubernetes.io/projected/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-kube-api-access-njnhf\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:35 crc kubenswrapper[4836]: I0217 14:27:35.968407 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.271139 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-q25rr" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.271135 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-q25rr" event={"ID":"6a1d4ef8-03d9-42d8-ae0b-9410767ed25f","Type":"ContainerDied","Data":"d0ca56828b8b775526488a0852f84f24a4af21e12c68a30c67a55c19e97b65de"} Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.271342 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ca56828b8b775526488a0852f84f24a4af21e12c68a30c67a55c19e97b65de" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.298441 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"400f3b9611221e6f2eab0fbb1342619856c817ae813d92448fcfbd94d6c95a02"} Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.298517 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"e482046c-502a-4f41-b013-7b3ef1c71ee1","Type":"ContainerStarted","Data":"f662b0f988bf0c51a41f39bffce76367bf85b3cac50090e090da9029607a1a75"} Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.411126 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=48.463657773 podStartE2EDuration="1m2.411098228s" podCreationTimestamp="2026-02-17 14:26:34 +0000 UTC" firstStartedPulling="2026-02-17 14:27:19.326606863 +0000 UTC m=+1265.669535132" lastFinishedPulling="2026-02-17 14:27:33.274047318 +0000 UTC m=+1279.616975587" observedRunningTime="2026-02-17 14:27:36.353899012 +0000 UTC m=+1282.696827311" watchObservedRunningTime="2026-02-17 14:27:36.411098228 +0000 UTC m=+1282.754026487" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.613946 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r25dh"] Feb 17 14:27:36 crc 
kubenswrapper[4836]: E0217 14:27:36.614852 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623225aa-2492-494e-be5b-92acef6f23cf" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.614881 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="623225aa-2492-494e-be5b-92acef6f23cf" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.614918 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" containerName="keystone-db-sync" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.614929 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" containerName="keystone-db-sync" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.614943 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4edeb89f-0bd9-466e-a9f9-2d45575d2c72" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.614951 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="4edeb89f-0bd9-466e-a9f9-2d45575d2c72" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.614983 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb354e85-311d-40bb-ae4a-5c535d4d89b9" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.614990 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb354e85-311d-40bb-ae4a-5c535d4d89b9" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.615011 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1fe36f3-d6b6-44e0-b85b-6def754fd08e" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615017 4836 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a1fe36f3-d6b6-44e0-b85b-6def754fd08e" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.615033 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ce1c7a-57e8-491e-84ab-8aed8baea37b" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615039 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ce1c7a-57e8-491e-84ab-8aed8baea37b" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.615048 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee1a0f2-86df-4f97-957a-22bbd7da4505" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615054 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee1a0f2-86df-4f97-957a-22bbd7da4505" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.615063 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767841a7-db94-430a-b408-10e5bd0350e5" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615070 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="767841a7-db94-430a-b408-10e5bd0350e5" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: E0217 14:27:36.615083 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ee15e8-6695-454f-83ad-d54176458497" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615091 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ee15e8-6695-454f-83ad-d54176458497" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615336 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="623225aa-2492-494e-be5b-92acef6f23cf" containerName="mariadb-account-create-update" Feb 17 
14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615364 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1fe36f3-d6b6-44e0-b85b-6def754fd08e" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615375 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" containerName="keystone-db-sync" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615388 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ce1c7a-57e8-491e-84ab-8aed8baea37b" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615400 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="4edeb89f-0bd9-466e-a9f9-2d45575d2c72" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615408 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee1a0f2-86df-4f97-957a-22bbd7da4505" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615418 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb354e85-311d-40bb-ae4a-5c535d4d89b9" containerName="mariadb-database-create" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615424 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ee15e8-6695-454f-83ad-d54176458497" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.615435 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="767841a7-db94-430a-b408-10e5bd0350e5" containerName="mariadb-account-create-update" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.617187 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r25dh" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.636968 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.637146 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.637846 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.638222 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.643876 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-w4x5z"] Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.648682 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.676786 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r25dh"] Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.679487 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-s87v5" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.688891 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-w4x5z"] Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757058 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-scripts\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757127 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-fernet-keys\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757207 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbt2q\" (UniqueName: \"kubernetes.io/projected/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-kube-api-access-lbt2q\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757268 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-credential-keys\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757353 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn7bv\" (UniqueName: \"kubernetes.io/projected/d985347f-7490-475c-a126-182ed65224d4-kube-api-access-fn7bv\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757448 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-config\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 
14:27:36.757500 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757540 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-combined-ca-bundle\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757601 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757636 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-dns-svc\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.757689 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-config-data\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864136 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-scripts\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864204 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-fernet-keys\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864310 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbt2q\" (UniqueName: \"kubernetes.io/projected/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-kube-api-access-lbt2q\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864402 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-credential-keys\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864466 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn7bv\" (UniqueName: \"kubernetes.io/projected/d985347f-7490-475c-a126-182ed65224d4-kube-api-access-fn7bv\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864580 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-config\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864625 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864672 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-combined-ca-bundle\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864736 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864770 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-dns-svc\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.864818 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-config-data\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.867501 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-config\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.868433 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-sb\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.868853 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-nb\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.869060 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-dns-svc\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.910601 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-combined-ca-bundle\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.913264 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-scripts\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.922228 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-fernet-keys\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.924559 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-config-data\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.934155 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbt2q\" (UniqueName: \"kubernetes.io/projected/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-kube-api-access-lbt2q\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.934728 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-credential-keys\") pod \"keystone-bootstrap-r25dh\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.962361 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-qqwhc"]
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.964199 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.982094 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.982453 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.982611 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cg95t"
Feb 17 14:27:36 crc kubenswrapper[4836]: I0217 14:27:36.990458 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn7bv\" (UniqueName: \"kubernetes.io/projected/d985347f-7490-475c-a126-182ed65224d4-kube-api-access-fn7bv\") pod \"dnsmasq-dns-f877ddd87-w4x5z\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.027181 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-w4x5z"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.028194 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r25dh"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.046067 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qqwhc"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.105230 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-scripts\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.105313 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmbq\" (UniqueName: \"kubernetes.io/projected/8185c649-f1ad-4230-830d-07d002e5b358-kube-api-access-ftmbq\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.117414 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-combined-ca-bundle\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.117776 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-db-sync-config-data\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.117922 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8185c649-f1ad-4230-830d-07d002e5b358-etc-machine-id\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.118078 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-config-data\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.129973 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-pvljf"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.132031 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.159156 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-l28cf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.160157 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.160472 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.160612 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.196581 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-pvljf"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.227783 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-scripts\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.227888 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-scripts\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.227971 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftmbq\" (UniqueName: \"kubernetes.io/projected/8185c649-f1ad-4230-830d-07d002e5b358-kube-api-access-ftmbq\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.228028 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-combined-ca-bundle\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.228134 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-certs\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.228219 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-db-sync-config-data\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.228846 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8185c649-f1ad-4230-830d-07d002e5b358-etc-machine-id\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.229163 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-config-data\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.229226 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-combined-ca-bundle\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.229425 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-config-data\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.229602 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfrn2\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-kube-api-access-hfrn2\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.229907 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8185c649-f1ad-4230-830d-07d002e5b358-etc-machine-id\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.530490 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-combined-ca-bundle\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.551632 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-db-sync-config-data\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.555123 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-certs\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.555355 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-combined-ca-bundle\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.555433 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-config-data\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.555511 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfrn2\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-kube-api-access-hfrn2\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.555598 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-scripts\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.561137 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-scripts\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.569105 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-config-data\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.580195 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-combined-ca-bundle\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.593488 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-certs\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.594131 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-scripts\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.596618 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftmbq\" (UniqueName: \"kubernetes.io/projected/8185c649-f1ad-4230-830d-07d002e5b358-kube-api-access-ftmbq\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.598668 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-g9l4s"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.612944 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.619916 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fkh7w"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.634539 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-config-data\") pod \"cinder-db-sync-qqwhc\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") " pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.661353 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.721287 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfrn2\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-kube-api-access-hfrn2\") pod \"cloudkitty-db-sync-pvljf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.742274 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g9l4s"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.757912 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.769032 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqtgf\" (UniqueName: \"kubernetes.io/projected/18361bc2-5db1-4611-be18-38593e0b5d5d-kube-api-access-sqtgf\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.769141 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-db-sync-config-data\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.769170 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-combined-ca-bundle\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.776679 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-sb6h7"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.778379 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sb6h7"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.786442 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.786796 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.787140 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qfhnd"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.796641 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-w4x5z"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.827983 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sb6h7"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.855848 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-pvljf"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.861208 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pdhxs"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.863810 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pdhxs"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.874857 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.876089 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7l8w5"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.876284 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.877496 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqtgf\" (UniqueName: \"kubernetes.io/projected/18361bc2-5db1-4611-be18-38593e0b5d5d-kube-api-access-sqtgf\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.877553 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7hf\" (UniqueName: \"kubernetes.io/projected/81ddbaec-f370-44a3-802b-26980ea65d2f-kube-api-access-rb7hf\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.877633 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-combined-ca-bundle\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.877727 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-db-sync-config-data\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.877747 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-combined-ca-bundle\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.878056 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-config\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.887900 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-db-sync-config-data\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.888166 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-combined-ca-bundle\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.902104 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9qw4t"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.911773 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9qw4t"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.915699 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqtgf\" (UniqueName: \"kubernetes.io/projected/18361bc2-5db1-4611-be18-38593e0b5d5d-kube-api-access-sqtgf\") pod \"barbican-db-sync-g9l4s\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " pod="openstack/barbican-db-sync-g9l4s"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.918034 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.940854 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pdhxs"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.989490 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9qw4t"]
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.995771 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-config\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.995873 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k78mf\" (UniqueName: \"kubernetes.io/projected/1fe4b42c-afbf-41e1-8035-5fffb156eadc-kube-api-access-k78mf\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.996019 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-config\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.996283 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-scripts\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.996799 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7hf\" (UniqueName: \"kubernetes.io/projected/81ddbaec-f370-44a3-802b-26980ea65d2f-kube-api-access-rb7hf\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.996959 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-config-data\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.997083 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.997175 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4b42c-afbf-41e1-8035-5fffb156eadc-logs\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.997271 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-combined-ca-bundle\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.997667 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-combined-ca-bundle\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.998838 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.999230 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t"
Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.999466 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwksm\" (UniqueName: \"kubernetes.io/projected/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-kube-api-access-rwksm\") pod
\"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:37 crc kubenswrapper[4836]: I0217 14:27:37.999612 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-svc\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.001937 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-config\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.011753 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-combined-ca-bundle\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.019128 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.035257 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7hf\" (UniqueName: \"kubernetes.io/projected/81ddbaec-f370-44a3-802b-26980ea65d2f-kube-api-access-rb7hf\") pod \"neutron-db-sync-sb6h7\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " pod="openstack/neutron-db-sync-sb6h7" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.055110 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.060204 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.060569 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.091570 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g9l4s" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.108123 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sb6h7" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.113993 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-run-httpd\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127397 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-config-data\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127541 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127621 4836 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4b42c-afbf-41e1-8035-5fffb156eadc-logs\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127674 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-config-data\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127733 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127780 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-combined-ca-bundle\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127849 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127929 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127960 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwksm\" (UniqueName: \"kubernetes.io/projected/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-kube-api-access-rwksm\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.127990 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.128025 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-svc\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.128189 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-log-httpd\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.128313 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-config\") pod 
\"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.128375 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k78mf\" (UniqueName: \"kubernetes.io/projected/1fe4b42c-afbf-41e1-8035-5fffb156eadc-kube-api-access-k78mf\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.128447 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpknx\" (UniqueName: \"kubernetes.io/projected/2a1d16f5-4710-43b4-805e-315ed73bb24e-kube-api-access-dpknx\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.130033 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.130342 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.146202 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-config-data\") pod \"placement-db-sync-pdhxs\" (UID: 
\"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.173279 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4b42c-afbf-41e1-8035-5fffb156eadc-logs\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.174143 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.184272 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-svc\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.184361 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.185469 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-config\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.185553 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-scripts\") pod \"ceilometer-0\" (UID: 
\"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.185732 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-scripts\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.193567 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwksm\" (UniqueName: \"kubernetes.io/projected/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-kube-api-access-rwksm\") pod \"dnsmasq-dns-5959f8865f-9qw4t\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.204728 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9qw4t"] Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.206335 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.207453 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-scripts\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.210214 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-combined-ca-bundle\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.215594 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k78mf\" (UniqueName: \"kubernetes.io/projected/1fe4b42c-afbf-41e1-8035-5fffb156eadc-kube-api-access-k78mf\") pod \"placement-db-sync-pdhxs\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.285321 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"] Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.288601 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.291213 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.291345 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-log-httpd\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.291560 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpknx\" (UniqueName: \"kubernetes.io/projected/2a1d16f5-4710-43b4-805e-315ed73bb24e-kube-api-access-dpknx\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.291685 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-scripts\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.291847 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-run-httpd\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.291915 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-config-data\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.291932 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.294007 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-log-httpd\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.294999 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-run-httpd\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.298030 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.301116 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-scripts\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.311490 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.323279 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-config-data\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.333979 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpknx\" (UniqueName: \"kubernetes.io/projected/2a1d16f5-4710-43b4-805e-315ed73bb24e-kube-api-access-dpknx\") pod \"ceilometer-0\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.343872 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"] Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.411841 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.412023 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc 
kubenswrapper[4836]: I0217 14:27:38.412112 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.412453 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.414555 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n78t2\" (UniqueName: \"kubernetes.io/projected/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-kube-api-access-n78t2\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.414767 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-config\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.444942 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.508380 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pdhxs" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.658888 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-config\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.659161 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.659455 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.659713 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.659880 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 
14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.659907 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n78t2\" (UniqueName: \"kubernetes.io/projected/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-kube-api-access-n78t2\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.662487 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.662560 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.663049 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-config\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.663975 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.664277 4836 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.721199 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n78t2\" (UniqueName: \"kubernetes.io/projected/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-kube-api-access-n78t2\") pod \"dnsmasq-dns-58dd9ff6bc-9hcq9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.804601 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r25dh" event={"ID":"9b18f8ba-fa1b-4a70-8774-0df51c645ed9","Type":"ContainerStarted","Data":"57caf7dfbbf9619fcde234bc6e52e4ee9643128225ce4df5e2ebf099d43860d3"} Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.804698 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r25dh"] Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.804717 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-w4x5z"] Feb 17 14:27:38 crc kubenswrapper[4836]: I0217 14:27:38.941284 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:39 crc kubenswrapper[4836]: I0217 14:27:39.956494 4836 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.009196 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" event={"ID":"d985347f-7490-475c-a126-182ed65224d4","Type":"ContainerStarted","Data":"9b142894b75620c580a00cf3c274a19998723fde1cfc4c18c89919815fac6fa8"} Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.027049 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-qqwhc"] Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.046958 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-pvljf"] Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.048740 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qqwhc" event={"ID":"8185c649-f1ad-4230-830d-07d002e5b358","Type":"ContainerStarted","Data":"b3482ed7c18ae58a71068d39ec0f731b2f5c23d1bee2fd95e9d280383de59ee3"} Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.074391 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r25dh" podStartSLOduration=4.074350939 podStartE2EDuration="4.074350939s" podCreationTimestamp="2026-02-17 14:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:27:40.067105744 +0000 UTC m=+1286.410034023" watchObservedRunningTime="2026-02-17 14:27:40.074350939 +0000 UTC m=+1286.417279208" Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.428946 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sb6h7"] Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.449752 4836 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9qw4t"] Feb 17 14:27:40 crc kubenswrapper[4836]: W0217 14:27:40.463117 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e9a920b_04d0_41e4_8a9e_3b53f5ab7705.slice/crio-c42b88cfee7c21f45ce13367daa0b57526553b3dda8bb68d81609f13bacecaf3 WatchSource:0}: Error finding container c42b88cfee7c21f45ce13367daa0b57526553b3dda8bb68d81609f13bacecaf3: Status 404 returned error can't find the container with id c42b88cfee7c21f45ce13367daa0b57526553b3dda8bb68d81609f13bacecaf3 Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.493896 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-g9l4s"] Feb 17 14:27:40 crc kubenswrapper[4836]: W0217 14:27:40.583995 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18361bc2_5db1_4611_be18_38593e0b5d5d.slice/crio-b92684999081dd7ebe3c5f048ea5d9a568a0e24a28001ce5ab97b2282351bbcb WatchSource:0}: Error finding container b92684999081dd7ebe3c5f048ea5d9a568a0e24a28001ce5ab97b2282351bbcb: Status 404 returned error can't find the container with id b92684999081dd7ebe3c5f048ea5d9a568a0e24a28001ce5ab97b2282351bbcb Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.911394 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"] Feb 17 14:27:40 crc kubenswrapper[4836]: I0217 14:27:40.960269 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.004588 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pdhxs"] Feb 17 14:27:41 crc kubenswrapper[4836]: W0217 14:27:41.033799 4836 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccce7d80_ec87_4fb2_a75f_1b5ddc2f4be9.slice/crio-e9f16c54dee6fca57cba69a1f24712669edc03c2f1b74e5ff682993352dbd1af WatchSource:0}: Error finding container e9f16c54dee6fca57cba69a1f24712669edc03c2f1b74e5ff682993352dbd1af: Status 404 returned error can't find the container with id e9f16c54dee6fca57cba69a1f24712669edc03c2f1b74e5ff682993352dbd1af Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.132809 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb6h7" event={"ID":"81ddbaec-f370-44a3-802b-26980ea65d2f","Type":"ContainerStarted","Data":"a94f2fee60c2cb9701b67002fd76857eaeaf8cc9cdf14886139c9af2827d62a6"} Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.153794 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.165554 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerStarted","Data":"725a655ac601adcaa8185b937f6643704390b16c79c731f2de3ba649c346ef2b"} Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.192714 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-pvljf" event={"ID":"4e016162-2025-44ad-989d-ce71d9f8f9bf","Type":"ContainerStarted","Data":"5256492605b5f72154c618f9880c205b521d09a7d2c8e835b6a6c8642893045e"} Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.206315 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" event={"ID":"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705","Type":"ContainerStarted","Data":"c42b88cfee7c21f45ce13367daa0b57526553b3dda8bb68d81609f13bacecaf3"} Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.243624 4836 generic.go:334] "Generic (PLEG): container finished" podID="d985347f-7490-475c-a126-182ed65224d4" 
containerID="14423eb209623d815ed52e92ff6318e5e659fcf35e927a649dbd595f58224937" exitCode=0 Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.243804 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" event={"ID":"d985347f-7490-475c-a126-182ed65224d4","Type":"ContainerDied","Data":"14423eb209623d815ed52e92ff6318e5e659fcf35e927a649dbd595f58224937"} Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.268806 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g9l4s" event={"ID":"18361bc2-5db1-4611-be18-38593e0b5d5d","Type":"ContainerStarted","Data":"b92684999081dd7ebe3c5f048ea5d9a568a0e24a28001ce5ab97b2282351bbcb"} Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.271829 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r25dh" event={"ID":"9b18f8ba-fa1b-4a70-8774-0df51c645ed9","Type":"ContainerStarted","Data":"85bf6d2c05b11776e36fd7dffb8368edf8f8e5b125a942780ac6175dd831a159"} Feb 17 14:27:41 crc kubenswrapper[4836]: I0217 14:27:41.298191 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" event={"ID":"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9","Type":"ContainerStarted","Data":"e9f16c54dee6fca57cba69a1f24712669edc03c2f1b74e5ff682993352dbd1af"} Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.369761 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb6h7" event={"ID":"81ddbaec-f370-44a3-802b-26980ea65d2f","Type":"ContainerStarted","Data":"35ecf820b0414db1c94b077c083568db5d4a957bb9d735db9d4e378b6ebbc861"} Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.387716 4836 generic.go:334] "Generic (PLEG): container finished" podID="6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" containerID="4dc5211e3fe44dc01a9738a9c6be073fa788d5b4e5643aa53e9529d0d5d0943a" exitCode=0 Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.387876 4836 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" event={"ID":"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705","Type":"ContainerDied","Data":"4dc5211e3fe44dc01a9738a9c6be073fa788d5b4e5643aa53e9529d0d5d0943a"} Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.409682 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-sb6h7" podStartSLOduration=5.409519045 podStartE2EDuration="5.409519045s" podCreationTimestamp="2026-02-17 14:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:27:42.408467417 +0000 UTC m=+1288.751395696" watchObservedRunningTime="2026-02-17 14:27:42.409519045 +0000 UTC m=+1288.752447314" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.450445 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" event={"ID":"d985347f-7490-475c-a126-182ed65224d4","Type":"ContainerDied","Data":"9b142894b75620c580a00cf3c274a19998723fde1cfc4c18c89919815fac6fa8"} Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.450496 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b142894b75620c580a00cf3c274a19998723fde1cfc4c18c89919815fac6fa8" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.456535 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.466639 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pdhxs" event={"ID":"1fe4b42c-afbf-41e1-8035-5fffb156eadc","Type":"ContainerStarted","Data":"63af66d6a1d8223670f744aeb2ae7fb99f6f7344d8ab31ed483859da53a657a7"} Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.501197 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-config\") pod \"d985347f-7490-475c-a126-182ed65224d4\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.501428 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-nb\") pod \"d985347f-7490-475c-a126-182ed65224d4\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.501468 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-dns-svc\") pod \"d985347f-7490-475c-a126-182ed65224d4\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.501503 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-sb\") pod \"d985347f-7490-475c-a126-182ed65224d4\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.501682 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn7bv\" (UniqueName: 
\"kubernetes.io/projected/d985347f-7490-475c-a126-182ed65224d4-kube-api-access-fn7bv\") pod \"d985347f-7490-475c-a126-182ed65224d4\" (UID: \"d985347f-7490-475c-a126-182ed65224d4\") " Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.515177 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d985347f-7490-475c-a126-182ed65224d4-kube-api-access-fn7bv" (OuterVolumeSpecName: "kube-api-access-fn7bv") pod "d985347f-7490-475c-a126-182ed65224d4" (UID: "d985347f-7490-475c-a126-182ed65224d4"). InnerVolumeSpecName "kube-api-access-fn7bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.556177 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d985347f-7490-475c-a126-182ed65224d4" (UID: "d985347f-7490-475c-a126-182ed65224d4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.556564 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-config" (OuterVolumeSpecName: "config") pod "d985347f-7490-475c-a126-182ed65224d4" (UID: "d985347f-7490-475c-a126-182ed65224d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.600050 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d985347f-7490-475c-a126-182ed65224d4" (UID: "d985347f-7490-475c-a126-182ed65224d4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.605952 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn7bv\" (UniqueName: \"kubernetes.io/projected/d985347f-7490-475c-a126-182ed65224d4-kube-api-access-fn7bv\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.605991 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.606020 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.606030 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.626752 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d985347f-7490-475c-a126-182ed65224d4" (UID: "d985347f-7490-475c-a126-182ed65224d4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:42 crc kubenswrapper[4836]: I0217 14:27:42.733629 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d985347f-7490-475c-a126-182ed65224d4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.357809 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.476325 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-svc\") pod \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.476478 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-nb\") pod \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.476729 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-config\") pod \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.476956 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwksm\" (UniqueName: \"kubernetes.io/projected/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-kube-api-access-rwksm\") pod \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.477053 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-sb\") pod \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.477244 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-swift-storage-0\") pod \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\" (UID: \"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705\") " Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.489272 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-kube-api-access-rwksm" (OuterVolumeSpecName: "kube-api-access-rwksm") pod "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" (UID: "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705"). InnerVolumeSpecName "kube-api-access-rwksm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.515023 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" event={"ID":"6e9a920b-04d0-41e4-8a9e-3b53f5ab7705","Type":"ContainerDied","Data":"c42b88cfee7c21f45ce13367daa0b57526553b3dda8bb68d81609f13bacecaf3"} Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.515114 4836 scope.go:117] "RemoveContainer" containerID="4dc5211e3fe44dc01a9738a9c6be073fa788d5b4e5643aa53e9529d0d5d0943a" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.515525 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-9qw4t" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.525367 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" (UID: "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.535986 4836 generic.go:334] "Generic (PLEG): container finished" podID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerID="99b7d1e9f2cb717570cc4209028495f2ccc23c4beb025f8110935cc03d58feb9" exitCode=0 Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.536126 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f877ddd87-w4x5z" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.536992 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" (UID: "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.537939 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" event={"ID":"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9","Type":"ContainerDied","Data":"99b7d1e9f2cb717570cc4209028495f2ccc23c4beb025f8110935cc03d58feb9"} Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.541385 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" (UID: "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.557782 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-config" (OuterVolumeSpecName: "config") pod "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" (UID: "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.568978 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" (UID: "6e9a920b-04d0-41e4-8a9e-3b53f5ab7705"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.588770 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.589267 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.589313 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.589334 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwksm\" (UniqueName: \"kubernetes.io/projected/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-kube-api-access-rwksm\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:43 crc 
kubenswrapper[4836]: I0217 14:27:43.589350 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:43 crc kubenswrapper[4836]: I0217 14:27:43.589377 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:44 crc kubenswrapper[4836]: I0217 14:27:44.005795 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-w4x5z"] Feb 17 14:27:44 crc kubenswrapper[4836]: I0217 14:27:44.020288 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f877ddd87-w4x5z"] Feb 17 14:27:44 crc kubenswrapper[4836]: I0217 14:27:44.050229 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9qw4t"] Feb 17 14:27:44 crc kubenswrapper[4836]: I0217 14:27:44.068262 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-9qw4t"] Feb 17 14:27:44 crc kubenswrapper[4836]: I0217 14:27:44.721873 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" path="/var/lib/kubelet/pods/6e9a920b-04d0-41e4-8a9e-3b53f5ab7705/volumes" Feb 17 14:27:44 crc kubenswrapper[4836]: I0217 14:27:44.722765 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d985347f-7490-475c-a126-182ed65224d4" path="/var/lib/kubelet/pods/d985347f-7490-475c-a126-182ed65224d4/volumes" Feb 17 14:27:45 crc kubenswrapper[4836]: I0217 14:27:45.715167 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" event={"ID":"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9","Type":"ContainerStarted","Data":"2fca778edd45bdfb866af7aaa0fc6f307d910a96cf1cd5ecfab2d14db35f72e8"} Feb 
17 14:27:45 crc kubenswrapper[4836]: I0217 14:27:45.715808 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:45 crc kubenswrapper[4836]: I0217 14:27:45.793414 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" podStartSLOduration=8.793384334 podStartE2EDuration="8.793384334s" podCreationTimestamp="2026-02-17 14:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:27:45.751373578 +0000 UTC m=+1292.094301877" watchObservedRunningTime="2026-02-17 14:27:45.793384334 +0000 UTC m=+1292.136312613" Feb 17 14:27:46 crc kubenswrapper[4836]: I0217 14:27:46.738253 4836 generic.go:334] "Generic (PLEG): container finished" podID="9b18f8ba-fa1b-4a70-8774-0df51c645ed9" containerID="85bf6d2c05b11776e36fd7dffb8368edf8f8e5b125a942780ac6175dd831a159" exitCode=0 Feb 17 14:27:46 crc kubenswrapper[4836]: I0217 14:27:46.739608 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r25dh" event={"ID":"9b18f8ba-fa1b-4a70-8774-0df51c645ed9","Type":"ContainerDied","Data":"85bf6d2c05b11776e36fd7dffb8368edf8f8e5b125a942780ac6175dd831a159"} Feb 17 14:27:48 crc kubenswrapper[4836]: I0217 14:27:48.776107 4836 generic.go:334] "Generic (PLEG): container finished" podID="df3a6cf1-bca0-45b2-9f7c-6d483452d49d" containerID="2953db160f228060c084b5fd479ec149c2b0acd6cacae4957fb68229d08ae1b9" exitCode=0 Feb 17 14:27:48 crc kubenswrapper[4836]: I0217 14:27:48.776274 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z8g7x" event={"ID":"df3a6cf1-bca0-45b2-9f7c-6d483452d49d","Type":"ContainerDied","Data":"2953db160f228060c084b5fd479ec149c2b0acd6cacae4957fb68229d08ae1b9"} Feb 17 14:27:53 crc kubenswrapper[4836]: I0217 14:27:53.944613 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.025995 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wbh2w"] Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.026818 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-wbh2w" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" containerID="cri-o://de75bc86bd0570fcef07a3f3195cfec352721b59eef66e22b061ebca87ca6456" gracePeriod=10 Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.698491 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r25dh" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.715833 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z8g7x" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.751256 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-wbh2w" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.815862 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-scripts\") pod \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.817163 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-credential-keys\") pod \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " Feb 17 14:27:54 crc 
kubenswrapper[4836]: I0217 14:27:54.817222 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-fernet-keys\") pod \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.817265 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grffb\" (UniqueName: \"kubernetes.io/projected/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-kube-api-access-grffb\") pod \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.817318 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbt2q\" (UniqueName: \"kubernetes.io/projected/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-kube-api-access-lbt2q\") pod \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.817364 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-combined-ca-bundle\") pod \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.817476 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-config-data\") pod \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\" (UID: \"9b18f8ba-fa1b-4a70-8774-0df51c645ed9\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.817499 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-db-sync-config-data\") pod \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.818540 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-config-data\") pod \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.825290 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9b18f8ba-fa1b-4a70-8774-0df51c645ed9" (UID: "9b18f8ba-fa1b-4a70-8774-0df51c645ed9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.826621 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9b18f8ba-fa1b-4a70-8774-0df51c645ed9" (UID: "9b18f8ba-fa1b-4a70-8774-0df51c645ed9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.826828 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-kube-api-access-lbt2q" (OuterVolumeSpecName: "kube-api-access-lbt2q") pod "9b18f8ba-fa1b-4a70-8774-0df51c645ed9" (UID: "9b18f8ba-fa1b-4a70-8774-0df51c645ed9"). InnerVolumeSpecName "kube-api-access-lbt2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.826959 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-kube-api-access-grffb" (OuterVolumeSpecName: "kube-api-access-grffb") pod "df3a6cf1-bca0-45b2-9f7c-6d483452d49d" (UID: "df3a6cf1-bca0-45b2-9f7c-6d483452d49d"). InnerVolumeSpecName "kube-api-access-grffb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.827178 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-scripts" (OuterVolumeSpecName: "scripts") pod "9b18f8ba-fa1b-4a70-8774-0df51c645ed9" (UID: "9b18f8ba-fa1b-4a70-8774-0df51c645ed9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.831096 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "df3a6cf1-bca0-45b2-9f7c-6d483452d49d" (UID: "df3a6cf1-bca0-45b2-9f7c-6d483452d49d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.859220 4836 generic.go:334] "Generic (PLEG): container finished" podID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerID="de75bc86bd0570fcef07a3f3195cfec352721b59eef66e22b061ebca87ca6456" exitCode=0 Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.859343 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wbh2w" event={"ID":"312259c2-4f8f-401d-a19e-64d0bc7dd35f","Type":"ContainerDied","Data":"de75bc86bd0570fcef07a3f3195cfec352721b59eef66e22b061ebca87ca6456"} Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.862023 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r25dh" event={"ID":"9b18f8ba-fa1b-4a70-8774-0df51c645ed9","Type":"ContainerDied","Data":"57caf7dfbbf9619fcde234bc6e52e4ee9643128225ce4df5e2ebf099d43860d3"} Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.862093 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57caf7dfbbf9619fcde234bc6e52e4ee9643128225ce4df5e2ebf099d43860d3" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.862673 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r25dh" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.864106 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-config-data" (OuterVolumeSpecName: "config-data") pod "9b18f8ba-fa1b-4a70-8774-0df51c645ed9" (UID: "9b18f8ba-fa1b-4a70-8774-0df51c645ed9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.866225 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-z8g7x" event={"ID":"df3a6cf1-bca0-45b2-9f7c-6d483452d49d","Type":"ContainerDied","Data":"a970e805deb8fc7e4ea80574fe0f4020e8d303f5c75ae4049947b41814dd24fc"} Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.866254 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a970e805deb8fc7e4ea80574fe0f4020e8d303f5c75ae4049947b41814dd24fc" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.866373 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-z8g7x" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.866586 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b18f8ba-fa1b-4a70-8774-0df51c645ed9" (UID: "9b18f8ba-fa1b-4a70-8774-0df51c645ed9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.884793 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-config-data" (OuterVolumeSpecName: "config-data") pod "df3a6cf1-bca0-45b2-9f7c-6d483452d49d" (UID: "df3a6cf1-bca0-45b2-9f7c-6d483452d49d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.920248 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-combined-ca-bundle\") pod \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\" (UID: \"df3a6cf1-bca0-45b2-9f7c-6d483452d49d\") " Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921328 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921356 4836 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921367 4836 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921385 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grffb\" (UniqueName: \"kubernetes.io/projected/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-kube-api-access-grffb\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921396 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbt2q\" (UniqueName: \"kubernetes.io/projected/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-kube-api-access-lbt2q\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921406 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921415 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b18f8ba-fa1b-4a70-8774-0df51c645ed9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921425 4836 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.921434 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:54 crc kubenswrapper[4836]: I0217 14:27:54.952207 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df3a6cf1-bca0-45b2-9f7c-6d483452d49d" (UID: "df3a6cf1-bca0-45b2-9f7c-6d483452d49d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.023457 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df3a6cf1-bca0-45b2-9f7c-6d483452d49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.832446 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r25dh"] Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.855494 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r25dh"] Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.920707 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vmgps"] Feb 17 14:27:55 crc kubenswrapper[4836]: E0217 14:27:55.921279 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3a6cf1-bca0-45b2-9f7c-6d483452d49d" containerName="glance-db-sync" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921385 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3a6cf1-bca0-45b2-9f7c-6d483452d49d" containerName="glance-db-sync" Feb 17 14:27:55 crc kubenswrapper[4836]: E0217 14:27:55.921411 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d985347f-7490-475c-a126-182ed65224d4" containerName="init" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921417 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d985347f-7490-475c-a126-182ed65224d4" containerName="init" Feb 17 14:27:55 crc kubenswrapper[4836]: E0217 14:27:55.921427 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b18f8ba-fa1b-4a70-8774-0df51c645ed9" containerName="keystone-bootstrap" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921436 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b18f8ba-fa1b-4a70-8774-0df51c645ed9" containerName="keystone-bootstrap" Feb 17 14:27:55 crc 
kubenswrapper[4836]: E0217 14:27:55.921457 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" containerName="init" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921464 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" containerName="init" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921706 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3a6cf1-bca0-45b2-9f7c-6d483452d49d" containerName="glance-db-sync" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921727 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d985347f-7490-475c-a126-182ed65224d4" containerName="init" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921746 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b18f8ba-fa1b-4a70-8774-0df51c645ed9" containerName="keystone-bootstrap" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.921758 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e9a920b-04d0-41e4-8a9e-3b53f5ab7705" containerName="init" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.923363 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.928515 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.928827 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.929153 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.933019 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-s87v5" Feb 17 14:27:55 crc kubenswrapper[4836]: I0217 14:27:55.953551 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vmgps"] Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.044923 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-config-data\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.045013 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-combined-ca-bundle\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.045062 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-credential-keys\") pod \"keystone-bootstrap-vmgps\" (UID: 
\"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.045102 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-scripts\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.045156 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-fernet-keys\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.045255 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflgh\" (UniqueName: \"kubernetes.io/projected/10331926-261d-4e44-a8c2-89846903ca12-kube-api-access-fflgh\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.150332 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflgh\" (UniqueName: \"kubernetes.io/projected/10331926-261d-4e44-a8c2-89846903ca12-kube-api-access-fflgh\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.150462 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-config-data\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " 
pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.150500 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-combined-ca-bundle\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.150533 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-credential-keys\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.150567 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-scripts\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.150616 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-fernet-keys\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.166791 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-credential-keys\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.167550 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-combined-ca-bundle\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.173360 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-scripts\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.173924 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-config-data\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.174396 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-fernet-keys\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.185832 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflgh\" (UniqueName: \"kubernetes.io/projected/10331926-261d-4e44-a8c2-89846903ca12-kube-api-access-fflgh\") pod \"keystone-bootstrap-vmgps\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.251377 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mpdz8"] Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 
14:27:56.253332 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.261762 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.293160 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mpdz8"] Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.356863 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-config\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.356938 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.356977 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfb6n\" (UniqueName: \"kubernetes.io/projected/522206b4-5f50-46e4-a363-24021bd65471-kube-api-access-hfb6n\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.357084 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-svc\") pod 
\"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.357101 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.357125 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.460852 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.460911 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.460954 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-sb\") pod 
\"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.461034 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-config\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.461086 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.461135 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfb6n\" (UniqueName: \"kubernetes.io/projected/522206b4-5f50-46e4-a363-24021bd65471-kube-api-access-hfb6n\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.462084 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.462184 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: 
\"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.462394 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.462806 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-config\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.463461 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.489079 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfb6n\" (UniqueName: \"kubernetes.io/projected/522206b4-5f50-46e4-a363-24021bd65471-kube-api-access-hfb6n\") pod \"dnsmasq-dns-785d8bcb8c-mpdz8\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.582233 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b18f8ba-fa1b-4a70-8774-0df51c645ed9" path="/var/lib/kubelet/pods/9b18f8ba-fa1b-4a70-8774-0df51c645ed9/volumes" Feb 17 14:27:56 crc kubenswrapper[4836]: I0217 14:27:56.609092 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.236598 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.239222 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.247164 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.247372 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qbbvn" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.251068 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.261376 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.384896 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.385150 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 
14:27:57.385518 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-logs\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.385863 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.385963 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqt5t\" (UniqueName: \"kubernetes.io/projected/c3515b05-3f55-44fc-9578-ee0d73cb7382-kube-api-access-jqt5t\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.386004 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.386082 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc 
kubenswrapper[4836]: I0217 14:27:57.397367 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.399308 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.402267 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.416472 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.488635 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.488726 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-logs\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.488800 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.488828 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqt5t\" (UniqueName: 
\"kubernetes.io/projected/c3515b05-3f55-44fc-9578-ee0d73cb7382-kube-api-access-jqt5t\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.488850 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.488874 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.488917 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.493330 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-logs\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.493357 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.496118 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.505418 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.508150 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.508202 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f1c05c143b5a67726d067625f4c5da25dac4624853da03b1088e3ef561519b77/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.510433 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqt5t\" (UniqueName: \"kubernetes.io/projected/c3515b05-3f55-44fc-9578-ee0d73cb7382-kube-api-access-jqt5t\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.538220 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.567847 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " pod="openstack/glance-default-external-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.591044 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.591199 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.591278 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.591340 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njnqq\" (UniqueName: \"kubernetes.io/projected/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-kube-api-access-njnqq\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.591376 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.591591 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.591702 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.712427 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.712703 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.712960 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.712999 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-njnqq\" (UniqueName: \"kubernetes.io/projected/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-kube-api-access-njnqq\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.713024 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.713376 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.713439 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.714050 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-logs\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.714851 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.719755 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.719859 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.719901 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20e9fd566d593755c515c6f55c386051b7cebe94721b27d85313d87ab22fcec4/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.721033 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.743900 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.785002 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.797774 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njnqq\" (UniqueName: \"kubernetes.io/projected/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-kube-api-access-njnqq\") pod \"glance-default-internal-api-0\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.808308 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:27:57 crc kubenswrapper[4836]: I0217 14:27:57.868274 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:27:59 crc kubenswrapper[4836]: I0217 14:27:59.418059 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:27:59 crc kubenswrapper[4836]: I0217 14:27:59.514331 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:27:59 crc kubenswrapper[4836]: I0217 14:27:59.752466 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-wbh2w" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Feb 17 14:27:59 crc kubenswrapper[4836]: I0217 14:27:59.953677 4836 generic.go:334] "Generic (PLEG): container finished" podID="6fec8667-7189-4e29-8362-37dd935d2db7" containerID="a82e37c7eb14ee548654e466a1de02d0ef7f18f1bf7fd37d772effc7cc961f91" exitCode=0 Feb 17 14:27:59 crc kubenswrapper[4836]: I0217 14:27:59.953749 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6fec8667-7189-4e29-8362-37dd935d2db7","Type":"ContainerDied","Data":"a82e37c7eb14ee548654e466a1de02d0ef7f18f1bf7fd37d772effc7cc961f91"} Feb 17 14:28:01 crc kubenswrapper[4836]: E0217 14:28:01.627071 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 17 14:28:01 crc kubenswrapper[4836]: E0217 14:28:01.628091 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqtgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-g9l4s_openstack(18361bc2-5db1-4611-be18-38593e0b5d5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:28:01 crc kubenswrapper[4836]: E0217 14:28:01.629384 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-g9l4s" 
podUID="18361bc2-5db1-4611-be18-38593e0b5d5d" Feb 17 14:28:01 crc kubenswrapper[4836]: E0217 14:28:01.982453 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-g9l4s" podUID="18361bc2-5db1-4611-be18-38593e0b5d5d" Feb 17 14:28:04 crc kubenswrapper[4836]: I0217 14:28:04.747895 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-wbh2w" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused" Feb 17 14:28:04 crc kubenswrapper[4836]: I0217 14:28:04.748814 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:28:05 crc kubenswrapper[4836]: I0217 14:28:05.011955 4836 generic.go:334] "Generic (PLEG): container finished" podID="81ddbaec-f370-44a3-802b-26980ea65d2f" containerID="35ecf820b0414db1c94b077c083568db5d4a957bb9d735db9d4e378b6ebbc861" exitCode=0 Feb 17 14:28:05 crc kubenswrapper[4836]: I0217 14:28:05.012026 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb6h7" event={"ID":"81ddbaec-f370-44a3-802b-26980ea65d2f","Type":"ContainerDied","Data":"35ecf820b0414db1c94b077c083568db5d4a957bb9d735db9d4e378b6ebbc861"} Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.534248 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sb6h7" Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.607844 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb7hf\" (UniqueName: \"kubernetes.io/projected/81ddbaec-f370-44a3-802b-26980ea65d2f-kube-api-access-rb7hf\") pod \"81ddbaec-f370-44a3-802b-26980ea65d2f\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.608178 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-combined-ca-bundle\") pod \"81ddbaec-f370-44a3-802b-26980ea65d2f\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.608240 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-config\") pod \"81ddbaec-f370-44a3-802b-26980ea65d2f\" (UID: \"81ddbaec-f370-44a3-802b-26980ea65d2f\") " Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.617663 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ddbaec-f370-44a3-802b-26980ea65d2f-kube-api-access-rb7hf" (OuterVolumeSpecName: "kube-api-access-rb7hf") pod "81ddbaec-f370-44a3-802b-26980ea65d2f" (UID: "81ddbaec-f370-44a3-802b-26980ea65d2f"). InnerVolumeSpecName "kube-api-access-rb7hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.865235 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-config" (OuterVolumeSpecName: "config") pod "81ddbaec-f370-44a3-802b-26980ea65d2f" (UID: "81ddbaec-f370-44a3-802b-26980ea65d2f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.871626 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb7hf\" (UniqueName: \"kubernetes.io/projected/81ddbaec-f370-44a3-802b-26980ea65d2f-kube-api-access-rb7hf\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.871658 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.875989 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81ddbaec-f370-44a3-802b-26980ea65d2f" (UID: "81ddbaec-f370-44a3-802b-26980ea65d2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:12 crc kubenswrapper[4836]: I0217 14:28:12.974351 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ddbaec-f370-44a3-802b-26980ea65d2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.107054 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb6h7" event={"ID":"81ddbaec-f370-44a3-802b-26980ea65d2f","Type":"ContainerDied","Data":"a94f2fee60c2cb9701b67002fd76857eaeaf8cc9cdf14886139c9af2827d62a6"} Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.107109 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a94f2fee60c2cb9701b67002fd76857eaeaf8cc9cdf14886139c9af2827d62a6" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.107206 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sb6h7" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.940460 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mpdz8"] Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.958393 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56bdc657f6-lhdd4"] Feb 17 14:28:13 crc kubenswrapper[4836]: E0217 14:28:13.959090 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ddbaec-f370-44a3-802b-26980ea65d2f" containerName="neutron-db-sync" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.959120 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ddbaec-f370-44a3-802b-26980ea65d2f" containerName="neutron-db-sync" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.959416 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ddbaec-f370-44a3-802b-26980ea65d2f" containerName="neutron-db-sync" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.960805 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.965831 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.966352 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.972595 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.972682 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-qfhnd" Feb 17 14:28:13 crc kubenswrapper[4836]: I0217 14:28:13.981414 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56bdc657f6-lhdd4"] Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:13.999390 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-config\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:13.999460 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-combined-ca-bundle\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:13.999586 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-httpd-config\") pod \"neutron-56bdc657f6-lhdd4\" (UID: 
\"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:13.999620 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-ovndb-tls-certs\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:13.999648 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whzb4\" (UniqueName: \"kubernetes.io/projected/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-kube-api-access-whzb4\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: E0217 14:28:14.048430 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 17 14:28:14 crc kubenswrapper[4836]: E0217 14:28:14.048681 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftmbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-qqwhc_openstack(8185c649-f1ad-4230-830d-07d002e5b358): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:28:14 crc kubenswrapper[4836]: E0217 14:28:14.051155 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-qqwhc" podUID="8185c649-f1ad-4230-830d-07d002e5b358" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.082514 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-knj6m"] Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.094523 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-knj6m"] Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.094711 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105439 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-config\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105541 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-combined-ca-bundle\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105589 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7h58\" (UniqueName: \"kubernetes.io/projected/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-kube-api-access-f7h58\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105639 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105669 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: 
\"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105698 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105781 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105830 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-httpd-config\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105872 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-ovndb-tls-certs\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105904 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whzb4\" (UniqueName: \"kubernetes.io/projected/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-kube-api-access-whzb4\") pod \"neutron-56bdc657f6-lhdd4\" (UID: 
\"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.105966 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-config\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.121827 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-config\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.128163 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-combined-ca-bundle\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.128521 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-httpd-config\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.155287 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whzb4\" (UniqueName: \"kubernetes.io/projected/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-kube-api-access-whzb4\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc 
kubenswrapper[4836]: I0217 14:28:14.157872 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-ovndb-tls-certs\") pod \"neutron-56bdc657f6-lhdd4\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.166857 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-wbh2w" event={"ID":"312259c2-4f8f-401d-a19e-64d0bc7dd35f","Type":"ContainerDied","Data":"c6f4101d16fd86bcceb0625244616ff16d1c5665adecebcc6d46b7d7f983a200"} Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.166922 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f4101d16fd86bcceb0625244616ff16d1c5665adecebcc6d46b7d7f983a200" Feb 17 14:28:14 crc kubenswrapper[4836]: E0217 14:28:14.168622 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-qqwhc" podUID="8185c649-f1ad-4230-830d-07d002e5b358" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.211241 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-config\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.212650 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-config\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " 
pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.215353 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7h58\" (UniqueName: \"kubernetes.io/projected/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-kube-api-access-f7h58\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.215464 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.215497 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.215516 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.215688 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" 
Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.217156 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.217316 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-svc\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.217918 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.218411 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.235920 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7h58\" (UniqueName: \"kubernetes.io/projected/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-kube-api-access-f7h58\") pod \"dnsmasq-dns-55f844cf75-knj6m\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") " pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.311213 4836 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.314366 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.420274 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-config\") pod \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.420378 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-nb\") pod \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.420661 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76vb5\" (UniqueName: \"kubernetes.io/projected/312259c2-4f8f-401d-a19e-64d0bc7dd35f-kube-api-access-76vb5\") pod \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.420903 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-dns-svc\") pod \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\" (UID: \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.420963 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-sb\") pod \"312259c2-4f8f-401d-a19e-64d0bc7dd35f\" (UID: 
\"312259c2-4f8f-401d-a19e-64d0bc7dd35f\") " Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.421043 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.425989 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312259c2-4f8f-401d-a19e-64d0bc7dd35f-kube-api-access-76vb5" (OuterVolumeSpecName: "kube-api-access-76vb5") pod "312259c2-4f8f-401d-a19e-64d0bc7dd35f" (UID: "312259c2-4f8f-401d-a19e-64d0bc7dd35f"). InnerVolumeSpecName "kube-api-access-76vb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.477657 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "312259c2-4f8f-401d-a19e-64d0bc7dd35f" (UID: "312259c2-4f8f-401d-a19e-64d0bc7dd35f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.480087 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "312259c2-4f8f-401d-a19e-64d0bc7dd35f" (UID: "312259c2-4f8f-401d-a19e-64d0bc7dd35f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.493633 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "312259c2-4f8f-401d-a19e-64d0bc7dd35f" (UID: "312259c2-4f8f-401d-a19e-64d0bc7dd35f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.493907 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-config" (OuterVolumeSpecName: "config") pod "312259c2-4f8f-401d-a19e-64d0bc7dd35f" (UID: "312259c2-4f8f-401d-a19e-64d0bc7dd35f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.524700 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76vb5\" (UniqueName: \"kubernetes.io/projected/312259c2-4f8f-401d-a19e-64d0bc7dd35f-kube-api-access-76vb5\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.524735 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.524745 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.524754 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.524762 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/312259c2-4f8f-401d-a19e-64d0bc7dd35f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:14 crc kubenswrapper[4836]: I0217 14:28:14.747714 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-wbh2w" 
podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout" Feb 17 14:28:15 crc kubenswrapper[4836]: I0217 14:28:15.183693 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-wbh2w" Feb 17 14:28:15 crc kubenswrapper[4836]: I0217 14:28:15.217980 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wbh2w"] Feb 17 14:28:15 crc kubenswrapper[4836]: I0217 14:28:15.230880 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-wbh2w"] Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.475727 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-bc789578f-mcrrx"] Feb 17 14:28:16 crc kubenswrapper[4836]: E0217 14:28:16.480015 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.480051 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" Feb 17 14:28:16 crc kubenswrapper[4836]: E0217 14:28:16.480089 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="init" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.480098 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="init" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.480355 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" containerName="dnsmasq-dns" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.482647 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.488373 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.489214 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.496208 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bc789578f-mcrrx"] Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.585214 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312259c2-4f8f-401d-a19e-64d0bc7dd35f" path="/var/lib/kubelet/pods/312259c2-4f8f-401d-a19e-64d0bc7dd35f/volumes" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.591379 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-internal-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.591505 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-combined-ca-bundle\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.591551 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-ovndb-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " 
pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.591707 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-httpd-config\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.591769 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-config\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.592665 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-public-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.592698 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv8ws\" (UniqueName: \"kubernetes.io/projected/a7dc98d2-302d-4633-8123-fe76bb7dbd40-kube-api-access-nv8ws\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.695211 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-public-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " 
pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.695286 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv8ws\" (UniqueName: \"kubernetes.io/projected/a7dc98d2-302d-4633-8123-fe76bb7dbd40-kube-api-access-nv8ws\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.695361 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-internal-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.695413 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-combined-ca-bundle\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.695441 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-ovndb-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.695495 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-httpd-config\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 
14:28:16.695521 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-config\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.703756 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-internal-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.704027 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-config\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.704213 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-combined-ca-bundle\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.712396 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-ovndb-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.713270 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-public-tls-certs\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.714016 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-httpd-config\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.718274 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv8ws\" (UniqueName: \"kubernetes.io/projected/a7dc98d2-302d-4633-8123-fe76bb7dbd40-kube-api-access-nv8ws\") pod \"neutron-bc789578f-mcrrx\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:16 crc kubenswrapper[4836]: I0217 14:28:16.810955 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:21 crc kubenswrapper[4836]: I0217 14:28:21.883668 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vmgps"] Feb 17 14:28:21 crc kubenswrapper[4836]: I0217 14:28:21.906841 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mpdz8"] Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.004510 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.090745 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.225717 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-bc789578f-mcrrx"] Feb 17 14:28:22 crc kubenswrapper[4836]: E0217 14:28:22.345135 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 17 14:28:22 crc kubenswrapper[4836]: E0217 14:28:22.345206 4836 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 17 14:28:22 crc kubenswrapper[4836]: E0217 14:28:22.345467 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hfrn2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-pvljf_openstack(4e016162-2025-44ad-989d-ce71d9f8f9bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:28:22 crc kubenswrapper[4836]: E0217 14:28:22.346759 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-pvljf" podUID="4e016162-2025-44ad-989d-ce71d9f8f9bf" Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.378990 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6","Type":"ContainerStarted","Data":"1bdfb1c3c1f902411f2380ceedd49dc7958dbb9a04d7b4060c81c560dbbd7e40"} Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.388461 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vmgps" event={"ID":"10331926-261d-4e44-a8c2-89846903ca12","Type":"ContainerStarted","Data":"02c724d11382ee98b69e6abaefd40a4e20c1b972def951d53202ec6f8b2b38f2"} Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.396613 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6fec8667-7189-4e29-8362-37dd935d2db7","Type":"ContainerStarted","Data":"9a61c69c04f2f3985c0e460f54a13203f50cbdd884dd38fc58f9989d463b2202"} Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.399831 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" 
event={"ID":"522206b4-5f50-46e4-a363-24021bd65471","Type":"ContainerStarted","Data":"fee297ba365480160ebc2531b71af26f97646ac6816136e490dca68ef994f4eb"} Feb 17 14:28:22 crc kubenswrapper[4836]: I0217 14:28:22.435148 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc789578f-mcrrx" event={"ID":"a7dc98d2-302d-4633-8123-fe76bb7dbd40","Type":"ContainerStarted","Data":"61e1414473aaed7533a1bb0fd531409b1cf0fa9ea0b92c1ed51519923f9cbabf"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.020529 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-knj6m"] Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.047193 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56bdc657f6-lhdd4"] Feb 17 14:28:23 crc kubenswrapper[4836]: W0217 14:28:23.082225 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7622952e_3f9a_4569_8f4d_8a07f1cbcd2c.slice/crio-8555cc4b8a651ad8d38601eead66d8910d6d4cd8c7c50d4ab726898662d8c02f WatchSource:0}: Error finding container 8555cc4b8a651ad8d38601eead66d8910d6d4cd8c7c50d4ab726898662d8c02f: Status 404 returned error can't find the container with id 8555cc4b8a651ad8d38601eead66d8910d6d4cd8c7c50d4ab726898662d8c02f Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.455851 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc657f6-lhdd4" event={"ID":"10f74a60-5438-45cd-a8e1-74ccc1c3b16a","Type":"ContainerStarted","Data":"3131621aad6bddf8f2539d514b9526e7c3c20a9b86076d983784e09cb9285473"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.458783 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pdhxs" event={"ID":"1fe4b42c-afbf-41e1-8035-5fffb156eadc","Type":"ContainerStarted","Data":"705f230fd2d44c1059294c17cc5410cef58dcabc1573c4e7f4f531d00aad46ec"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 
14:28:23.472784 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc789578f-mcrrx" event={"ID":"a7dc98d2-302d-4633-8123-fe76bb7dbd40","Type":"ContainerStarted","Data":"29474b05f933bb7261368e691fe6f6124baae6cdcaac7f0997ad485f3fcff20d"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.472848 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc789578f-mcrrx" event={"ID":"a7dc98d2-302d-4633-8123-fe76bb7dbd40","Type":"ContainerStarted","Data":"ef329f1c472e28115c477d0f824ce0452609f500341f5fe161170bb1b7dd1f36"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.473350 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.487639 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerStarted","Data":"6102d176b1010bbf234d415140cba35d28570c5b514c7edd1c4a0962a14c5149"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.488510 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pdhxs" podStartSLOduration=13.687779247 podStartE2EDuration="46.488494317s" podCreationTimestamp="2026-02-17 14:27:37 +0000 UTC" firstStartedPulling="2026-02-17 14:27:41.191924665 +0000 UTC m=+1287.534852924" lastFinishedPulling="2026-02-17 14:28:13.992639725 +0000 UTC m=+1320.335567994" observedRunningTime="2026-02-17 14:28:23.484588701 +0000 UTC m=+1329.827516970" watchObservedRunningTime="2026-02-17 14:28:23.488494317 +0000 UTC m=+1329.831422576" Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.502779 4836 generic.go:334] "Generic (PLEG): container finished" podID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerID="50d4a249bcc48e57b448052c5a0747dd07cf392d7bd62132728c04243ac9a69b" exitCode=0 Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.502990 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" event={"ID":"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c","Type":"ContainerDied","Data":"50d4a249bcc48e57b448052c5a0747dd07cf392d7bd62132728c04243ac9a69b"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.503040 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" event={"ID":"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c","Type":"ContainerStarted","Data":"8555cc4b8a651ad8d38601eead66d8910d6d4cd8c7c50d4ab726898662d8c02f"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.506860 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3515b05-3f55-44fc-9578-ee0d73cb7382","Type":"ContainerStarted","Data":"1c06a664d1e654a6bd86f236b173b45cd29b036aab35504b6bdf0b6a6af440cb"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.525317 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-bc789578f-mcrrx" podStartSLOduration=7.5252577259999995 podStartE2EDuration="7.525257726s" podCreationTimestamp="2026-02-17 14:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:23.520935439 +0000 UTC m=+1329.863863718" watchObservedRunningTime="2026-02-17 14:28:23.525257726 +0000 UTC m=+1329.868185995" Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.526136 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vmgps" event={"ID":"10331926-261d-4e44-a8c2-89846903ca12","Type":"ContainerStarted","Data":"0a4b8ba8b2087b1a38486d6f6172aee2da2f8fb8e22feee2e93bb22306b6558e"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.536036 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g9l4s" 
event={"ID":"18361bc2-5db1-4611-be18-38593e0b5d5d","Type":"ContainerStarted","Data":"13ef4f24a42269dbbf22aa927159da757007caa607e5236e1441cff6b685fe12"} Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.548900 4836 generic.go:334] "Generic (PLEG): container finished" podID="522206b4-5f50-46e4-a363-24021bd65471" containerID="96b7929b0efa57ddeb597b383fe5cd1d57bdd64461f3a7e920a20b1f3f965d47" exitCode=0 Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.550402 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" event={"ID":"522206b4-5f50-46e4-a363-24021bd65471","Type":"ContainerDied","Data":"96b7929b0efa57ddeb597b383fe5cd1d57bdd64461f3a7e920a20b1f3f965d47"} Feb 17 14:28:23 crc kubenswrapper[4836]: E0217 14:28:23.570240 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-pvljf" podUID="4e016162-2025-44ad-989d-ce71d9f8f9bf" Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.666560 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-g9l4s" podStartSLOduration=5.892190719 podStartE2EDuration="47.666526578s" podCreationTimestamp="2026-02-17 14:27:36 +0000 UTC" firstStartedPulling="2026-02-17 14:27:40.605470404 +0000 UTC m=+1286.948398673" lastFinishedPulling="2026-02-17 14:28:22.379806263 +0000 UTC m=+1328.722734532" observedRunningTime="2026-02-17 14:28:23.612722245 +0000 UTC m=+1329.955650524" watchObservedRunningTime="2026-02-17 14:28:23.666526578 +0000 UTC m=+1330.009454867" Feb 17 14:28:23 crc kubenswrapper[4836]: I0217 14:28:23.823190 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vmgps" podStartSLOduration=28.823158417 podStartE2EDuration="28.823158417s" 
podCreationTimestamp="2026-02-17 14:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:23.68794678 +0000 UTC m=+1330.030875049" watchObservedRunningTime="2026-02-17 14:28:23.823158417 +0000 UTC m=+1330.166086686" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.153801 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.272617 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-config\") pod \"522206b4-5f50-46e4-a363-24021bd65471\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.272741 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfb6n\" (UniqueName: \"kubernetes.io/projected/522206b4-5f50-46e4-a363-24021bd65471-kube-api-access-hfb6n\") pod \"522206b4-5f50-46e4-a363-24021bd65471\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.272860 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-sb\") pod \"522206b4-5f50-46e4-a363-24021bd65471\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.272944 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-nb\") pod \"522206b4-5f50-46e4-a363-24021bd65471\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.272988 4836 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-svc\") pod \"522206b4-5f50-46e4-a363-24021bd65471\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.273022 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-swift-storage-0\") pod \"522206b4-5f50-46e4-a363-24021bd65471\" (UID: \"522206b4-5f50-46e4-a363-24021bd65471\") " Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.283799 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/522206b4-5f50-46e4-a363-24021bd65471-kube-api-access-hfb6n" (OuterVolumeSpecName: "kube-api-access-hfb6n") pod "522206b4-5f50-46e4-a363-24021bd65471" (UID: "522206b4-5f50-46e4-a363-24021bd65471"). InnerVolumeSpecName "kube-api-access-hfb6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.311074 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "522206b4-5f50-46e4-a363-24021bd65471" (UID: "522206b4-5f50-46e4-a363-24021bd65471"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.329658 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-config" (OuterVolumeSpecName: "config") pod "522206b4-5f50-46e4-a363-24021bd65471" (UID: "522206b4-5f50-46e4-a363-24021bd65471"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.333818 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "522206b4-5f50-46e4-a363-24021bd65471" (UID: "522206b4-5f50-46e4-a363-24021bd65471"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.335938 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "522206b4-5f50-46e4-a363-24021bd65471" (UID: "522206b4-5f50-46e4-a363-24021bd65471"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.359966 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "522206b4-5f50-46e4-a363-24021bd65471" (UID: "522206b4-5f50-46e4-a363-24021bd65471"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.379662 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.380083 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.380159 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.380216 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.380271 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/522206b4-5f50-46e4-a363-24021bd65471-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.380365 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfb6n\" (UniqueName: \"kubernetes.io/projected/522206b4-5f50-46e4-a363-24021bd65471-kube-api-access-hfb6n\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.564893 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" event={"ID":"522206b4-5f50-46e4-a363-24021bd65471","Type":"ContainerDied","Data":"fee297ba365480160ebc2531b71af26f97646ac6816136e490dca68ef994f4eb"} Feb 17 14:28:24 crc 
kubenswrapper[4836]: I0217 14:28:24.565424 4836 scope.go:117] "RemoveContainer" containerID="96b7929b0efa57ddeb597b383fe5cd1d57bdd64461f3a7e920a20b1f3f965d47" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.565634 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-mpdz8" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.635997 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6","Type":"ContainerStarted","Data":"869ce82d3359043cc4431b6028a34dabc445615d7136178d3f86f4e3032f0d97"} Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.663652 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" event={"ID":"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c","Type":"ContainerStarted","Data":"85a21ea6f28662473a5cbe42dfa68ac85c766a6f09753e438f2c37af7356f777"} Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.665029 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.681688 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3515b05-3f55-44fc-9578-ee0d73cb7382","Type":"ContainerStarted","Data":"4184adaaf005ab31a219f3203092826b82f06f8762db3d006dcd2fec9f1d8ea0"} Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.687264 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc657f6-lhdd4" event={"ID":"10f74a60-5438-45cd-a8e1-74ccc1c3b16a","Type":"ContainerStarted","Data":"d2098b2a7c4dcbee4fa27ea9bfa1c19e32c5f83e96aa663b877abb8284852c74"} Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.687463 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc657f6-lhdd4" 
event={"ID":"10f74a60-5438-45cd-a8e1-74ccc1c3b16a","Type":"ContainerStarted","Data":"b7a5e210ee7a505ae087f3c56329942b71db962383e4ae1693812dd8340169c8"} Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.689079 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.789330 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mpdz8"] Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.819514 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-mpdz8"] Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.834327 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56bdc657f6-lhdd4" podStartSLOduration=11.834268717 podStartE2EDuration="11.834268717s" podCreationTimestamp="2026-02-17 14:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:24.769765424 +0000 UTC m=+1331.112693693" watchObservedRunningTime="2026-02-17 14:28:24.834268717 +0000 UTC m=+1331.177196996" Feb 17 14:28:24 crc kubenswrapper[4836]: I0217 14:28:24.858976 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" podStartSLOduration=11.858937989 podStartE2EDuration="11.858937989s" podCreationTimestamp="2026-02-17 14:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:24.80712148 +0000 UTC m=+1331.150049769" watchObservedRunningTime="2026-02-17 14:28:24.858937989 +0000 UTC m=+1331.201866258" Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.707169 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6","Type":"ContainerStarted","Data":"0f30cc20795a1be11a63cd83ca82ef1f96bc339f29c98fa9ea79201d66ddd14b"} Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.707310 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-log" containerID="cri-o://869ce82d3359043cc4431b6028a34dabc445615d7136178d3f86f4e3032f0d97" gracePeriod=30 Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.707399 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-httpd" containerID="cri-o://0f30cc20795a1be11a63cd83ca82ef1f96bc339f29c98fa9ea79201d66ddd14b" gracePeriod=30 Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.717341 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3515b05-3f55-44fc-9578-ee0d73cb7382","Type":"ContainerStarted","Data":"6ae7bcd92058237aae5f45aa971c6650154498599d3d666733d736f045d19bfc"} Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.717853 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-log" containerID="cri-o://4184adaaf005ab31a219f3203092826b82f06f8762db3d006dcd2fec9f1d8ea0" gracePeriod=30 Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.717991 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-httpd" containerID="cri-o://6ae7bcd92058237aae5f45aa971c6650154498599d3d666733d736f045d19bfc" gracePeriod=30 Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.765137 4836 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=29.765106596 podStartE2EDuration="29.765106596s" podCreationTimestamp="2026-02-17 14:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:25.746490109 +0000 UTC m=+1332.089418388" watchObservedRunningTime="2026-02-17 14:28:25.765106596 +0000 UTC m=+1332.108034865" Feb 17 14:28:25 crc kubenswrapper[4836]: I0217 14:28:25.804119 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=29.804093406 podStartE2EDuration="29.804093406s" podCreationTimestamp="2026-02-17 14:27:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:25.785395037 +0000 UTC m=+1332.128323316" watchObservedRunningTime="2026-02-17 14:28:25.804093406 +0000 UTC m=+1332.147021675" Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.650502 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="522206b4-5f50-46e4-a363-24021bd65471" path="/var/lib/kubelet/pods/522206b4-5f50-46e4-a363-24021bd65471/volumes" Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.752832 4836 generic.go:334] "Generic (PLEG): container finished" podID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerID="6ae7bcd92058237aae5f45aa971c6650154498599d3d666733d736f045d19bfc" exitCode=0 Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.753672 4836 generic.go:334] "Generic (PLEG): container finished" podID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerID="4184adaaf005ab31a219f3203092826b82f06f8762db3d006dcd2fec9f1d8ea0" exitCode=143 Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.752943 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"c3515b05-3f55-44fc-9578-ee0d73cb7382","Type":"ContainerDied","Data":"6ae7bcd92058237aae5f45aa971c6650154498599d3d666733d736f045d19bfc"} Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.753815 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3515b05-3f55-44fc-9578-ee0d73cb7382","Type":"ContainerDied","Data":"4184adaaf005ab31a219f3203092826b82f06f8762db3d006dcd2fec9f1d8ea0"} Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.761223 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6fec8667-7189-4e29-8362-37dd935d2db7","Type":"ContainerStarted","Data":"71b1cb8f78f78e8c4dd6692ccdd01ad2461897f8475fbee0b7f838d8c85e743a"} Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.761276 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6fec8667-7189-4e29-8362-37dd935d2db7","Type":"ContainerStarted","Data":"937e842c15cf5da92c6be36a32ba414a091df39b4e16414d5f646f11edcd1602"} Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.772314 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerStarted","Data":"adc3ef3643d684dbbbf0790a30dd752752d5a28971c3915143c0a6ec314bc365"} Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.786392 4836 generic.go:334] "Generic (PLEG): container finished" podID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerID="0f30cc20795a1be11a63cd83ca82ef1f96bc339f29c98fa9ea79201d66ddd14b" exitCode=0 Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.786444 4836 generic.go:334] "Generic (PLEG): container finished" podID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerID="869ce82d3359043cc4431b6028a34dabc445615d7136178d3f86f4e3032f0d97" exitCode=143 Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 
14:28:26.788384 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6","Type":"ContainerDied","Data":"0f30cc20795a1be11a63cd83ca82ef1f96bc339f29c98fa9ea79201d66ddd14b"} Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.788468 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6","Type":"ContainerDied","Data":"869ce82d3359043cc4431b6028a34dabc445615d7136178d3f86f4e3032f0d97"} Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.814405 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=68.814369624 podStartE2EDuration="1m8.814369624s" podCreationTimestamp="2026-02-17 14:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:26.802274576 +0000 UTC m=+1333.145202855" watchObservedRunningTime="2026-02-17 14:28:26.814369624 +0000 UTC m=+1333.157297903" Feb 17 14:28:26 crc kubenswrapper[4836]: I0217 14:28:26.943473 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.088866 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-combined-ca-bundle\") pod \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.088969 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-scripts\") pod \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.089110 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-httpd-run\") pod \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.089164 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njnqq\" (UniqueName: \"kubernetes.io/projected/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-kube-api-access-njnqq\") pod \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.089267 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-config-data\") pod \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.089684 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.089762 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-logs\") pod \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\" (UID: \"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.091777 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-logs" (OuterVolumeSpecName: "logs") pod "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" (UID: "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.095929 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" (UID: "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.103120 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-scripts" (OuterVolumeSpecName: "scripts") pod "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" (UID: "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.112350 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-kube-api-access-njnqq" (OuterVolumeSpecName: "kube-api-access-njnqq") pod "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" (UID: "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6"). InnerVolumeSpecName "kube-api-access-njnqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.123868 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf" (OuterVolumeSpecName: "glance") pod "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" (UID: "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6"). InnerVolumeSpecName "pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.133407 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.179706 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" (UID: "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195039 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-config-data" (OuterVolumeSpecName: "config-data") pod "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" (UID: "c16e95ef-bb15-4f5c-b53d-42ca0c183ed6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195350 4836 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") on node \"crc\" " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195377 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195389 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195400 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195409 4836 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195419 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njnqq\" (UniqueName: \"kubernetes.io/projected/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-kube-api-access-njnqq\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.195427 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 
14:28:27.250677 4836 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.250923 4836 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf") on node "crc" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.296587 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-scripts\") pod \"c3515b05-3f55-44fc-9578-ee0d73cb7382\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.296684 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-httpd-run\") pod \"c3515b05-3f55-44fc-9578-ee0d73cb7382\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.296873 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-config-data\") pod \"c3515b05-3f55-44fc-9578-ee0d73cb7382\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.297092 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"c3515b05-3f55-44fc-9578-ee0d73cb7382\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.297127 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-logs\") pod \"c3515b05-3f55-44fc-9578-ee0d73cb7382\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.297173 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-combined-ca-bundle\") pod \"c3515b05-3f55-44fc-9578-ee0d73cb7382\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.297346 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqt5t\" (UniqueName: \"kubernetes.io/projected/c3515b05-3f55-44fc-9578-ee0d73cb7382-kube-api-access-jqt5t\") pod \"c3515b05-3f55-44fc-9578-ee0d73cb7382\" (UID: \"c3515b05-3f55-44fc-9578-ee0d73cb7382\") " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.297575 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3515b05-3f55-44fc-9578-ee0d73cb7382" (UID: "c3515b05-3f55-44fc-9578-ee0d73cb7382"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.298001 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-logs" (OuterVolumeSpecName: "logs") pod "c3515b05-3f55-44fc-9578-ee0d73cb7382" (UID: "c3515b05-3f55-44fc-9578-ee0d73cb7382"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.298608 4836 reconciler_common.go:293] "Volume detached for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.298632 4836 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.298649 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3515b05-3f55-44fc-9578-ee0d73cb7382-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.304353 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3515b05-3f55-44fc-9578-ee0d73cb7382-kube-api-access-jqt5t" (OuterVolumeSpecName: "kube-api-access-jqt5t") pod "c3515b05-3f55-44fc-9578-ee0d73cb7382" (UID: "c3515b05-3f55-44fc-9578-ee0d73cb7382"). InnerVolumeSpecName "kube-api-access-jqt5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.305438 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-scripts" (OuterVolumeSpecName: "scripts") pod "c3515b05-3f55-44fc-9578-ee0d73cb7382" (UID: "c3515b05-3f55-44fc-9578-ee0d73cb7382"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.321031 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34" (OuterVolumeSpecName: "glance") pod "c3515b05-3f55-44fc-9578-ee0d73cb7382" (UID: "c3515b05-3f55-44fc-9578-ee0d73cb7382"). InnerVolumeSpecName "pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.344910 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3515b05-3f55-44fc-9578-ee0d73cb7382" (UID: "c3515b05-3f55-44fc-9578-ee0d73cb7382"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.364587 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-config-data" (OuterVolumeSpecName: "config-data") pod "c3515b05-3f55-44fc-9578-ee0d73cb7382" (UID: "c3515b05-3f55-44fc-9578-ee0d73cb7382"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.402011 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.402544 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.402678 4836 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") on node \"crc\" " Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.414439 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3515b05-3f55-44fc-9578-ee0d73cb7382-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.414826 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqt5t\" (UniqueName: \"kubernetes.io/projected/c3515b05-3f55-44fc-9578-ee0d73cb7382-kube-api-access-jqt5t\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.431677 4836 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.432577 4836 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34") on node "crc" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.517797 4836 reconciler_common.go:293] "Volume detached for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.802920 4836 generic.go:334] "Generic (PLEG): container finished" podID="10331926-261d-4e44-a8c2-89846903ca12" containerID="0a4b8ba8b2087b1a38486d6f6172aee2da2f8fb8e22feee2e93bb22306b6558e" exitCode=0 Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.803031 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vmgps" event={"ID":"10331926-261d-4e44-a8c2-89846903ca12","Type":"ContainerDied","Data":"0a4b8ba8b2087b1a38486d6f6172aee2da2f8fb8e22feee2e93bb22306b6558e"} Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.805835 4836 generic.go:334] "Generic (PLEG): container finished" podID="18361bc2-5db1-4611-be18-38593e0b5d5d" containerID="13ef4f24a42269dbbf22aa927159da757007caa607e5236e1441cff6b685fe12" exitCode=0 Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.805904 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g9l4s" event={"ID":"18361bc2-5db1-4611-be18-38593e0b5d5d","Type":"ContainerDied","Data":"13ef4f24a42269dbbf22aa927159da757007caa607e5236e1441cff6b685fe12"} Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.809522 4836 generic.go:334] "Generic (PLEG): container finished" podID="1fe4b42c-afbf-41e1-8035-5fffb156eadc" containerID="705f230fd2d44c1059294c17cc5410cef58dcabc1573c4e7f4f531d00aad46ec" exitCode=0 Feb 17 
14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.809623 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pdhxs" event={"ID":"1fe4b42c-afbf-41e1-8035-5fffb156eadc","Type":"ContainerDied","Data":"705f230fd2d44c1059294c17cc5410cef58dcabc1573c4e7f4f531d00aad46ec"} Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.814327 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.814404 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c16e95ef-bb15-4f5c-b53d-42ca0c183ed6","Type":"ContainerDied","Data":"1bdfb1c3c1f902411f2380ceedd49dc7958dbb9a04d7b4060c81c560dbbd7e40"} Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.814484 4836 scope.go:117] "RemoveContainer" containerID="0f30cc20795a1be11a63cd83ca82ef1f96bc339f29c98fa9ea79201d66ddd14b" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.819417 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.819450 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3515b05-3f55-44fc-9578-ee0d73cb7382","Type":"ContainerDied","Data":"1c06a664d1e654a6bd86f236b173b45cd29b036aab35504b6bdf0b6a6af440cb"} Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.934539 4836 scope.go:117] "RemoveContainer" containerID="869ce82d3359043cc4431b6028a34dabc445615d7136178d3f86f4e3032f0d97" Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.934767 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.947142 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.961871 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:28:27 crc kubenswrapper[4836]: I0217 14:28:27.995498 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.012063 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:28:28 crc kubenswrapper[4836]: E0217 14:28:28.012717 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-httpd" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.012747 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-httpd" Feb 17 14:28:28 crc kubenswrapper[4836]: E0217 14:28:28.012770 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="522206b4-5f50-46e4-a363-24021bd65471" containerName="init" Feb 17 14:28:28 crc 
kubenswrapper[4836]: I0217 14:28:28.012779 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="522206b4-5f50-46e4-a363-24021bd65471" containerName="init" Feb 17 14:28:28 crc kubenswrapper[4836]: E0217 14:28:28.012800 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-httpd" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.012809 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-httpd" Feb 17 14:28:28 crc kubenswrapper[4836]: E0217 14:28:28.012820 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-log" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.012828 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-log" Feb 17 14:28:28 crc kubenswrapper[4836]: E0217 14:28:28.012845 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-log" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.012854 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-log" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.013119 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="522206b4-5f50-46e4-a363-24021bd65471" containerName="init" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.013149 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-log" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.013163 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" containerName="glance-httpd" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.013171 4836 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-log" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.013192 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" containerName="glance-httpd" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.014673 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.020593 4836 scope.go:117] "RemoveContainer" containerID="6ae7bcd92058237aae5f45aa971c6650154498599d3d666733d736f045d19bfc" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.028753 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.029381 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.029558 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.029721 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qbbvn" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.063085 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.084637 4836 scope.go:117] "RemoveContainer" containerID="4184adaaf005ab31a219f3203092826b82f06f8762db3d006dcd2fec9f1d8ea0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.107749 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.113553 4836 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.118808 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.119941 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.148045 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.156981 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.157222 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.161569 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.161630 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.161783 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6smd\" (UniqueName: \"kubernetes.io/projected/9fc032cb-3063-4e39-a91f-ccc89defe9c4-kube-api-access-x6smd\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.161858 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.161932 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.161952 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.264703 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x6smd\" (UniqueName: \"kubernetes.io/projected/9fc032cb-3063-4e39-a91f-ccc89defe9c4-kube-api-access-x6smd\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.264810 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.264846 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.264863 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.265379 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cbj8\" (UniqueName: \"kubernetes.io/projected/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-kube-api-access-8cbj8\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.265421 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.265914 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266086 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266325 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266583 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266770 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266812 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266894 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266895 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.266924 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.267160 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-logs\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.267324 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-logs\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.267324 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.273816 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.275603 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.275690 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.276277 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.276328 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20e9fd566d593755c515c6f55c386051b7cebe94721b27d85313d87ab22fcec4/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.278877 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.292340 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6smd\" (UniqueName: \"kubernetes.io/projected/9fc032cb-3063-4e39-a91f-ccc89defe9c4-kube-api-access-x6smd\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.338064 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.362775 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.369579 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-logs\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.369678 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.369780 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cbj8\" (UniqueName: \"kubernetes.io/projected/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-kube-api-access-8cbj8\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.369801 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc 
kubenswrapper[4836]: I0217 14:28:28.369856 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.369913 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.369934 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.369957 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.375105 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-logs\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.375280 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.380200 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.381191 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.383722 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.388105 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.388174 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f1c05c143b5a67726d067625f4c5da25dac4624853da03b1088e3ef561519b77/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.388392 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.392889 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cbj8\" (UniqueName: \"kubernetes.io/projected/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-kube-api-access-8cbj8\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.435365 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.456062 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.623380 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c16e95ef-bb15-4f5c-b53d-42ca0c183ed6" path="/var/lib/kubelet/pods/c16e95ef-bb15-4f5c-b53d-42ca0c183ed6/volumes" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.625856 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3515b05-3f55-44fc-9578-ee0d73cb7382" path="/var/lib/kubelet/pods/c3515b05-3f55-44fc-9578-ee0d73cb7382/volumes" Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.854348 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qqwhc" event={"ID":"8185c649-f1ad-4230-830d-07d002e5b358","Type":"ContainerStarted","Data":"ff24c89536ae06cf6a0fbffcb68050de3e8ed22356c912b4e7e87afbef99480d"} Feb 17 14:28:28 crc kubenswrapper[4836]: I0217 14:28:28.904412 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-qqwhc" podStartSLOduration=5.757878491 podStartE2EDuration="52.904379829s" podCreationTimestamp="2026-02-17 14:27:36 +0000 UTC" firstStartedPulling="2026-02-17 14:27:39.956074452 +0000 UTC m=+1286.299002721" lastFinishedPulling="2026-02-17 14:28:27.10257579 +0000 UTC m=+1333.445504059" observedRunningTime="2026-02-17 14:28:28.881213309 +0000 UTC m=+1335.224141598" watchObservedRunningTime="2026-02-17 14:28:28.904379829 +0000 UTC m=+1335.247308098" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.036997 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.056038 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.296097 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.316573 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.417195 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"] Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.417536 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerName="dnsmasq-dns" containerID="cri-o://2fca778edd45bdfb866af7aaa0fc6f307d910a96cf1cd5ecfab2d14db35f72e8" gracePeriod=10 Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.747985 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pdhxs" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.753020 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.906656 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-g9l4s" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.913580 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c29f84b9-3879-4fc6-b2aa-e334bd08f24e","Type":"ContainerStarted","Data":"a73e6cf975755957f05fddc903522d5d75b3eb7f41eb5a42c5ad06b115f44634"} Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.951812 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-combined-ca-bundle\") pod \"10331926-261d-4e44-a8c2-89846903ca12\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.951987 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4b42c-afbf-41e1-8035-5fffb156eadc-logs\") pod \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952025 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k78mf\" (UniqueName: \"kubernetes.io/projected/1fe4b42c-afbf-41e1-8035-5fffb156eadc-kube-api-access-k78mf\") pod \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952082 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-combined-ca-bundle\") pod \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952154 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-config-data\") pod \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952183 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-scripts\") pod \"10331926-261d-4e44-a8c2-89846903ca12\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952331 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-config-data\") pod \"10331926-261d-4e44-a8c2-89846903ca12\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952374 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-credential-keys\") pod \"10331926-261d-4e44-a8c2-89846903ca12\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952486 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-scripts\") pod \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\" (UID: \"1fe4b42c-afbf-41e1-8035-5fffb156eadc\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952580 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fflgh\" (UniqueName: \"kubernetes.io/projected/10331926-261d-4e44-a8c2-89846903ca12-kube-api-access-fflgh\") pod \"10331926-261d-4e44-a8c2-89846903ca12\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.952617 4836 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-fernet-keys\") pod \"10331926-261d-4e44-a8c2-89846903ca12\" (UID: \"10331926-261d-4e44-a8c2-89846903ca12\") " Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.959505 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fc032cb-3063-4e39-a91f-ccc89defe9c4","Type":"ContainerStarted","Data":"38f3541a8bef919fb1afd541589fd4540ccef699d3e6a2e7f1dcb0859f09ea45"} Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.967081 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe4b42c-afbf-41e1-8035-5fffb156eadc-logs" (OuterVolumeSpecName: "logs") pod "1fe4b42c-afbf-41e1-8035-5fffb156eadc" (UID: "1fe4b42c-afbf-41e1-8035-5fffb156eadc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.977480 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "10331926-261d-4e44-a8c2-89846903ca12" (UID: "10331926-261d-4e44-a8c2-89846903ca12"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.977636 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "10331926-261d-4e44-a8c2-89846903ca12" (UID: "10331926-261d-4e44-a8c2-89846903ca12"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.983864 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vmgps" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.984074 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vmgps" event={"ID":"10331926-261d-4e44-a8c2-89846903ca12","Type":"ContainerDied","Data":"02c724d11382ee98b69e6abaefd40a4e20c1b972def951d53202ec6f8b2b38f2"} Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.984148 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02c724d11382ee98b69e6abaefd40a4e20c1b972def951d53202ec6f8b2b38f2" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.986968 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe4b42c-afbf-41e1-8035-5fffb156eadc-kube-api-access-k78mf" (OuterVolumeSpecName: "kube-api-access-k78mf") pod "1fe4b42c-afbf-41e1-8035-5fffb156eadc" (UID: "1fe4b42c-afbf-41e1-8035-5fffb156eadc"). InnerVolumeSpecName "kube-api-access-k78mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.987515 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-scripts" (OuterVolumeSpecName: "scripts") pod "1fe4b42c-afbf-41e1-8035-5fffb156eadc" (UID: "1fe4b42c-afbf-41e1-8035-5fffb156eadc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.989723 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-scripts" (OuterVolumeSpecName: "scripts") pod "10331926-261d-4e44-a8c2-89846903ca12" (UID: "10331926-261d-4e44-a8c2-89846903ca12"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.993952 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-g9l4s" event={"ID":"18361bc2-5db1-4611-be18-38593e0b5d5d","Type":"ContainerDied","Data":"b92684999081dd7ebe3c5f048ea5d9a568a0e24a28001ce5ab97b2282351bbcb"} Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.994042 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b92684999081dd7ebe3c5f048ea5d9a568a0e24a28001ce5ab97b2282351bbcb" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.994211 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-g9l4s" Feb 17 14:28:29 crc kubenswrapper[4836]: I0217 14:28:29.996569 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10331926-261d-4e44-a8c2-89846903ca12-kube-api-access-fflgh" (OuterVolumeSpecName: "kube-api-access-fflgh") pod "10331926-261d-4e44-a8c2-89846903ca12" (UID: "10331926-261d-4e44-a8c2-89846903ca12"). InnerVolumeSpecName "kube-api-access-fflgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.058461 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqtgf\" (UniqueName: \"kubernetes.io/projected/18361bc2-5db1-4611-be18-38593e0b5d5d-kube-api-access-sqtgf\") pod \"18361bc2-5db1-4611-be18-38593e0b5d5d\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.058597 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-combined-ca-bundle\") pod \"18361bc2-5db1-4611-be18-38593e0b5d5d\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.058781 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-db-sync-config-data\") pod \"18361bc2-5db1-4611-be18-38593e0b5d5d\" (UID: \"18361bc2-5db1-4611-be18-38593e0b5d5d\") " Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.059598 4836 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.059617 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.059627 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fflgh\" (UniqueName: \"kubernetes.io/projected/10331926-261d-4e44-a8c2-89846903ca12-kube-api-access-fflgh\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 
14:28:30.059639 4836 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.059653 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fe4b42c-afbf-41e1-8035-5fffb156eadc-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.059663 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k78mf\" (UniqueName: \"kubernetes.io/projected/1fe4b42c-afbf-41e1-8035-5fffb156eadc-kube-api-access-k78mf\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.059675 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.082547 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10331926-261d-4e44-a8c2-89846903ca12" (UID: "10331926-261d-4e44-a8c2-89846903ca12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.082917 4836 generic.go:334] "Generic (PLEG): container finished" podID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerID="2fca778edd45bdfb866af7aaa0fc6f307d910a96cf1cd5ecfab2d14db35f72e8" exitCode=0 Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.083064 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" event={"ID":"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9","Type":"ContainerDied","Data":"2fca778edd45bdfb866af7aaa0fc6f307d910a96cf1cd5ecfab2d14db35f72e8"} Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.083913 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-config-data" (OuterVolumeSpecName: "config-data") pod "10331926-261d-4e44-a8c2-89846903ca12" (UID: "10331926-261d-4e44-a8c2-89846903ca12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.100391 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pdhxs" event={"ID":"1fe4b42c-afbf-41e1-8035-5fffb156eadc","Type":"ContainerDied","Data":"63af66d6a1d8223670f744aeb2ae7fb99f6f7344d8ab31ed483859da53a657a7"} Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.100446 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63af66d6a1d8223670f744aeb2ae7fb99f6f7344d8ab31ed483859da53a657a7" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.100585 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pdhxs" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.121126 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-55d7557768-wvvpt"] Feb 17 14:28:30 crc kubenswrapper[4836]: E0217 14:28:30.121827 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe4b42c-afbf-41e1-8035-5fffb156eadc" containerName="placement-db-sync" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.121848 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe4b42c-afbf-41e1-8035-5fffb156eadc" containerName="placement-db-sync" Feb 17 14:28:30 crc kubenswrapper[4836]: E0217 14:28:30.121857 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18361bc2-5db1-4611-be18-38593e0b5d5d" containerName="barbican-db-sync" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.121864 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="18361bc2-5db1-4611-be18-38593e0b5d5d" containerName="barbican-db-sync" Feb 17 14:28:30 crc kubenswrapper[4836]: E0217 14:28:30.121881 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10331926-261d-4e44-a8c2-89846903ca12" containerName="keystone-bootstrap" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.121888 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="10331926-261d-4e44-a8c2-89846903ca12" containerName="keystone-bootstrap" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.122097 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe4b42c-afbf-41e1-8035-5fffb156eadc" containerName="placement-db-sync" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.122115 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="10331926-261d-4e44-a8c2-89846903ca12" containerName="keystone-bootstrap" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.122133 4836 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="18361bc2-5db1-4611-be18-38593e0b5d5d" containerName="barbican-db-sync" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.123667 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.129559 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.129859 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.129944 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "18361bc2-5db1-4611-be18-38593e0b5d5d" (UID: "18361bc2-5db1-4611-be18-38593e0b5d5d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.134759 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-config-data" (OuterVolumeSpecName: "config-data") pod "1fe4b42c-afbf-41e1-8035-5fffb156eadc" (UID: "1fe4b42c-afbf-41e1-8035-5fffb156eadc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.160350 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18361bc2-5db1-4611-be18-38593e0b5d5d-kube-api-access-sqtgf" (OuterVolumeSpecName: "kube-api-access-sqtgf") pod "18361bc2-5db1-4611-be18-38593e0b5d5d" (UID: "18361bc2-5db1-4611-be18-38593e0b5d5d"). InnerVolumeSpecName "kube-api-access-sqtgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.168821 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.168877 4836 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.168891 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.168903 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqtgf\" (UniqueName: \"kubernetes.io/projected/18361bc2-5db1-4611-be18-38593e0b5d5d-kube-api-access-sqtgf\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.168915 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10331926-261d-4e44-a8c2-89846903ca12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.171604 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-78c4d587b5-cqhdl"] Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.174638 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.179427 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.180428 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.190686 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18361bc2-5db1-4611-be18-38593e0b5d5d" (UID: "18361bc2-5db1-4611-be18-38593e0b5d5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.207534 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78c4d587b5-cqhdl"] Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.228342 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fe4b42c-afbf-41e1-8035-5fffb156eadc" (UID: "1fe4b42c-afbf-41e1-8035-5fffb156eadc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279181 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-public-tls-certs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279366 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-public-tls-certs\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279515 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-internal-tls-certs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279545 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-internal-tls-certs\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279576 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-scripts\") pod \"placement-55d7557768-wvvpt\" (UID: 
\"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279677 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99mjt\" (UniqueName: \"kubernetes.io/projected/f2f9acba-3f54-43b6-9461-31cba0cc954b-kube-api-access-99mjt\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279723 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-config-data\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279755 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-combined-ca-bundle\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279799 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-credential-keys\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279825 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfjpm\" (UniqueName: \"kubernetes.io/projected/21c73844-3235-4a12-9f77-901ba8614e11-kube-api-access-nfjpm\") pod 
\"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279855 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-scripts\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279909 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-fernet-keys\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.279961 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c73844-3235-4a12-9f77-901ba8614e11-logs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.280105 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-combined-ca-bundle\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.280176 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-config-data\") pod \"keystone-78c4d587b5-cqhdl\" 
(UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.280816 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18361bc2-5db1-4611-be18-38593e0b5d5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.280840 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe4b42c-afbf-41e1-8035-5fffb156eadc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.338602 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55d7557768-wvvpt"] Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391149 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-internal-tls-certs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391246 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-internal-tls-certs\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391277 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-scripts\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 
14:28:30.391398 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99mjt\" (UniqueName: \"kubernetes.io/projected/f2f9acba-3f54-43b6-9461-31cba0cc954b-kube-api-access-99mjt\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391442 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-config-data\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391465 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-combined-ca-bundle\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391504 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-credential-keys\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391528 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfjpm\" (UniqueName: \"kubernetes.io/projected/21c73844-3235-4a12-9f77-901ba8614e11-kube-api-access-nfjpm\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391581 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-scripts\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391724 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-fernet-keys\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.391942 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c73844-3235-4a12-9f77-901ba8614e11-logs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.392029 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-combined-ca-bundle\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.392087 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-config-data\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.392172 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-public-tls-certs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.392238 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-public-tls-certs\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.396074 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c73844-3235-4a12-9f77-901ba8614e11-logs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.399508 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-credential-keys\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.409736 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-config-data\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.412212 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-scripts\") pod \"keystone-78c4d587b5-cqhdl\" (UID: 
\"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.413510 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-public-tls-certs\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.420878 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-internal-tls-certs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.423411 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-public-tls-certs\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.432402 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-scripts\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.446064 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-fernet-keys\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 
14:28:30.447025 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-combined-ca-bundle\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.447208 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-internal-tls-certs\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.447543 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-config-data\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.447628 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2f9acba-3f54-43b6-9461-31cba0cc954b-combined-ca-bundle\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.455145 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfjpm\" (UniqueName: \"kubernetes.io/projected/21c73844-3235-4a12-9f77-901ba8614e11-kube-api-access-nfjpm\") pod \"placement-55d7557768-wvvpt\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.473468 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-99mjt\" (UniqueName: \"kubernetes.io/projected/f2f9acba-3f54-43b6-9461-31cba0cc954b-kube-api-access-99mjt\") pod \"keystone-78c4d587b5-cqhdl\" (UID: \"f2f9acba-3f54-43b6-9461-31cba0cc954b\") " pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.486238 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.542777 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bc958ddf6-kh2rq"] Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.544127 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.550378 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.568022 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc958ddf6-kh2rq"] Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.712395 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-combined-ca-bundle\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.712820 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-logs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.712868 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-internal-tls-certs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.712914 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-scripts\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.712952 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-config-data\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.713217 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-public-tls-certs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.713345 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75n2\" (UniqueName: \"kubernetes.io/projected/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-kube-api-access-q75n2\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.815313 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-public-tls-certs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.815414 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q75n2\" (UniqueName: \"kubernetes.io/projected/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-kube-api-access-q75n2\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.815457 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-combined-ca-bundle\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.815588 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-logs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.815624 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-internal-tls-certs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.815654 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-scripts\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.815687 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-config-data\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.827103 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-logs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.827346 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-config-data\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.827449 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-internal-tls-certs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.839480 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-scripts\") pod \"placement-bc958ddf6-kh2rq\" (UID: 
\"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.849727 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-combined-ca-bundle\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.856543 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-public-tls-certs\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.873438 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q75n2\" (UniqueName: \"kubernetes.io/projected/42c3b1e3-728a-4bd8-9669-bfe1656b6de2-kube-api-access-q75n2\") pod \"placement-bc958ddf6-kh2rq\" (UID: \"42c3b1e3-728a-4bd8-9669-bfe1656b6de2\") " pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:30 crc kubenswrapper[4836]: I0217 14:28:30.993129 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.115748 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fc032cb-3063-4e39-a91f-ccc89defe9c4","Type":"ContainerStarted","Data":"2d37a99072f4fb6a9bc38dee8c6986d96ef5977cd1d2c3dca6d3d95cb5f3bcee"} Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.326900 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6567fb9c77-xcq7p"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.341190 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.356618 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.356995 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.366042 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fkh7w" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.386354 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6567fb9c77-xcq7p"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.431860 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-68fd77ffbb-m5r5c"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.434035 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.444972 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.446570 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-config-data\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.446617 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-config-data-custom\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.446648 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-logs\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.446710 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-combined-ca-bundle\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.446782 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44t8g\" (UniqueName: \"kubernetes.io/projected/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-kube-api-access-44t8g\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.460578 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68fd77ffbb-m5r5c"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.548822 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jhnl\" (UniqueName: \"kubernetes.io/projected/f79d706e-2d22-49c6-acb5-dc3f130ab102-kube-api-access-4jhnl\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.548879 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-config-data-custom\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.548917 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44t8g\" (UniqueName: \"kubernetes.io/projected/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-kube-api-access-44t8g\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.548940 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-config-data\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.549019 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79d706e-2d22-49c6-acb5-dc3f130ab102-logs\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.549060 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-config-data\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.549082 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-config-data-custom\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.549112 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-logs\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.549181 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-combined-ca-bundle\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.549209 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-combined-ca-bundle\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.551269 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-logs\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.561503 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-config-data\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.570251 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9ds8"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.573417 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.583502 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-combined-ca-bundle\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.588100 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-config-data-custom\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.622448 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44t8g\" (UniqueName: \"kubernetes.io/projected/bf33e52a-365f-4ccc-8352-f4c7f8e2aebd-kube-api-access-44t8g\") pod \"barbican-worker-6567fb9c77-xcq7p\" (UID: \"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd\") " pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.630144 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9ds8"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.651680 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.651776 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f79d706e-2d22-49c6-acb5-dc3f130ab102-logs\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.652947 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653019 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653060 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spw5p\" (UniqueName: \"kubernetes.io/projected/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-kube-api-access-spw5p\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653138 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653474 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-combined-ca-bundle\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653542 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jhnl\" (UniqueName: \"kubernetes.io/projected/f79d706e-2d22-49c6-acb5-dc3f130ab102-kube-api-access-4jhnl\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653574 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-config-data-custom\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653603 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-config\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.653636 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-config-data\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 
14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.668447 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f79d706e-2d22-49c6-acb5-dc3f130ab102-logs\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.677686 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-combined-ca-bundle\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.678685 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-config-data\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.694861 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jhnl\" (UniqueName: \"kubernetes.io/projected/f79d706e-2d22-49c6-acb5-dc3f130ab102-kube-api-access-4jhnl\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.704393 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f79d706e-2d22-49c6-acb5-dc3f130ab102-config-data-custom\") pod \"barbican-keystone-listener-68fd77ffbb-m5r5c\" (UID: \"f79d706e-2d22-49c6-acb5-dc3f130ab102\") " 
pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.706458 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6567fb9c77-xcq7p" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.755387 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.756471 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-svc\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.757212 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.757260 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.757349 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.759212 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-649d9995c8-rcxvp"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.761420 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.763526 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spw5p\" (UniqueName: \"kubernetes.io/projected/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-kube-api-access-spw5p\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.765735 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.766051 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.766367 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.766880 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.773778 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-config\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.775931 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-config\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.800856 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.819174 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spw5p\" (UniqueName: \"kubernetes.io/projected/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-kube-api-access-spw5p\") pod \"dnsmasq-dns-85ff748b95-m9ds8\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.833596 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-649d9995c8-rcxvp"] Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.886549 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slsjs\" (UniqueName: \"kubernetes.io/projected/ed247b9d-af54-401e-80a3-82d18772f29d-kube-api-access-slsjs\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.886640 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-combined-ca-bundle\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.886825 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.886987 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data-custom\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.887189 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed247b9d-af54-401e-80a3-82d18772f29d-logs\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.946849 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.991196 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slsjs\" (UniqueName: \"kubernetes.io/projected/ed247b9d-af54-401e-80a3-82d18772f29d-kube-api-access-slsjs\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.991272 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-combined-ca-bundle\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.991337 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: 
\"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.991389 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data-custom\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.991464 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed247b9d-af54-401e-80a3-82d18772f29d-logs\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.991958 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed247b9d-af54-401e-80a3-82d18772f29d-logs\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.997810 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data-custom\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:31 crc kubenswrapper[4836]: I0217 14:28:31.998107 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-combined-ca-bundle\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 
17 14:28:32 crc kubenswrapper[4836]: I0217 14:28:32.001280 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:32 crc kubenswrapper[4836]: I0217 14:28:32.019058 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slsjs\" (UniqueName: \"kubernetes.io/projected/ed247b9d-af54-401e-80a3-82d18772f29d-kube-api-access-slsjs\") pod \"barbican-api-649d9995c8-rcxvp\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") " pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:32 crc kubenswrapper[4836]: I0217 14:28:32.254024 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:33 crc kubenswrapper[4836]: I0217 14:28:33.165602 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c29f84b9-3879-4fc6-b2aa-e334bd08f24e","Type":"ContainerStarted","Data":"4231e0f0134e5c8db2d1379ad611e9d1ddd911c706b7c534c46f5a480fa7035b"} Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.031742 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.047544 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.190814 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.691012 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7dc9c9fdbb-zxjj6"] Feb 17 14:28:34 crc 
kubenswrapper[4836]: I0217 14:28:34.693048 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.695623 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.696257 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.709314 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dc9c9fdbb-zxjj6"] Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.873883 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-logs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.873960 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-config-data-custom\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.873983 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-internal-tls-certs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.874233 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-combined-ca-bundle\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.874342 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-config-data\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.874743 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-public-tls-certs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.874788 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4bpt\" (UniqueName: \"kubernetes.io/projected/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-kube-api-access-v4bpt\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977125 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-logs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977267 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-config-data-custom\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977320 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-internal-tls-certs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977376 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-combined-ca-bundle\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977403 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-config-data\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977628 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-public-tls-certs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977675 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v4bpt\" (UniqueName: \"kubernetes.io/projected/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-kube-api-access-v4bpt\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.977725 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-logs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.984283 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-combined-ca-bundle\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.985030 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-config-data\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.986077 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-public-tls-certs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.986717 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-config-data-custom\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.988941 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-internal-tls-certs\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:34 crc kubenswrapper[4836]: I0217 14:28:34.999073 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4bpt\" (UniqueName: \"kubernetes.io/projected/62b902ba-6ba2-48f3-a6dc-652fd1d6d58c-kube-api-access-v4bpt\") pod \"barbican-api-7dc9c9fdbb-zxjj6\" (UID: \"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c\") " pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:35 crc kubenswrapper[4836]: I0217 14:28:35.023323 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" Feb 17 14:28:35 crc kubenswrapper[4836]: I0217 14:28:35.997543 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.117400 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-sb\") pod \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") "
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.117520 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-config\") pod \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") "
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.119755 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-swift-storage-0\") pod \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") "
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.119917 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-nb\") pod \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") "
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.120048 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n78t2\" (UniqueName: \"kubernetes.io/projected/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-kube-api-access-n78t2\") pod \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") "
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.120097 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-svc\") pod \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\" (UID: \"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9\") "
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.127147 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-kube-api-access-n78t2" (OuterVolumeSpecName: "kube-api-access-n78t2") pod "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" (UID: "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9"). InnerVolumeSpecName "kube-api-access-n78t2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.183596 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" (UID: "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.188476 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-config" (OuterVolumeSpecName: "config") pod "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" (UID: "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.193541 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" (UID: "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.220933 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" event={"ID":"ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9","Type":"ContainerDied","Data":"e9f16c54dee6fca57cba69a1f24712669edc03c2f1b74e5ff682993352dbd1af"}
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.221018 4836 scope.go:117] "RemoveContainer" containerID="2fca778edd45bdfb866af7aaa0fc6f307d910a96cf1cd5ecfab2d14db35f72e8"
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.221393 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.225065 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n78t2\" (UniqueName: \"kubernetes.io/projected/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-kube-api-access-n78t2\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.225105 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.225116 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.225125 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.252009 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" (UID: "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.287943 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" (UID: "ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.327494 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.327527 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.591653 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"]
Feb 17 14:28:36 crc kubenswrapper[4836]: I0217 14:28:36.601664 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-9hcq9"]
Feb 17 14:28:37 crc kubenswrapper[4836]: I0217 14:28:37.644699 4836 scope.go:117] "RemoveContainer" containerID="99b7d1e9f2cb717570cc4209028495f2ccc23c4beb025f8110935cc03d58feb9"
Feb 17 14:28:38 crc kubenswrapper[4836]: I0217 14:28:38.249461 4836 generic.go:334] "Generic (PLEG): container finished" podID="8185c649-f1ad-4230-830d-07d002e5b358" containerID="ff24c89536ae06cf6a0fbffcb68050de3e8ed22356c912b4e7e87afbef99480d" exitCode=0
Feb 17 14:28:38 crc kubenswrapper[4836]: I0217 14:28:38.249570 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qqwhc" event={"ID":"8185c649-f1ad-4230-830d-07d002e5b358","Type":"ContainerDied","Data":"ff24c89536ae06cf6a0fbffcb68050de3e8ed22356c912b4e7e87afbef99480d"}
Feb 17 14:28:38 crc kubenswrapper[4836]: I0217 14:28:38.581377 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" path="/var/lib/kubelet/pods/ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9/volumes"
Feb 17 14:28:38 crc kubenswrapper[4836]: I0217 14:28:38.943423 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-9hcq9" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: i/o timeout"
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.121379 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7dc9c9fdbb-zxjj6"]
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.131393 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-649d9995c8-rcxvp"]
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.141555 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-78c4d587b5-cqhdl"]
Feb 17 14:28:39 crc kubenswrapper[4836]: W0217 14:28:39.158755 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21c73844_3235_4a12_9f77_901ba8614e11.slice/crio-bbe789caf6ed33cc607fbf4e010b5eb03468b6cfaedd4af371c447ef9c0fa67b WatchSource:0}: Error finding container bbe789caf6ed33cc607fbf4e010b5eb03468b6cfaedd4af371c447ef9c0fa67b: Status 404 returned error can't find the container with id bbe789caf6ed33cc607fbf4e010b5eb03468b6cfaedd4af371c447ef9c0fa67b
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.175532 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-55d7557768-wvvpt"]
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.187625 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bc958ddf6-kh2rq"]
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.224544 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68fd77ffbb-m5r5c"]
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.224839 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6567fb9c77-xcq7p"]
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.260064 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9ds8"]
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.299727 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d7557768-wvvpt" event={"ID":"21c73844-3235-4a12-9f77-901ba8614e11","Type":"ContainerStarted","Data":"bbe789caf6ed33cc607fbf4e010b5eb03468b6cfaedd4af371c447ef9c0fa67b"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.310938 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc958ddf6-kh2rq" event={"ID":"42c3b1e3-728a-4bd8-9669-bfe1656b6de2","Type":"ContainerStarted","Data":"b84d4e5903eaf939e6e9df46d9fb1bb6356ca308386cbbf38e24f231a92fd785"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.314571 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c29f84b9-3879-4fc6-b2aa-e334bd08f24e","Type":"ContainerStarted","Data":"ffac93583d3a46218a79cd0eec11b0e9213bdce6e0622ee8ec1b1030a56cebbf"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.320079 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fc032cb-3063-4e39-a91f-ccc89defe9c4","Type":"ContainerStarted","Data":"253884c8bfee6f38dc03fef1da6c5e47b92d31a3b1592567360ef3f04d7144a9"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.322001 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" event={"ID":"f79d706e-2d22-49c6-acb5-dc3f130ab102","Type":"ContainerStarted","Data":"01bbc5f7898f61cdcdfc6bcc497fb9d3899fe8249745e1a2740b884ba8f14e3e"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.327410 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" event={"ID":"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c","Type":"ContainerStarted","Data":"edc3f4ab33e4ac096c5f5d2d06d6b958f1361e9414fe6db97b0070eb37f81b3b"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.347439 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6567fb9c77-xcq7p" event={"ID":"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd","Type":"ContainerStarted","Data":"a18fab57f2f011fbaa8104cb37092948da9f782eda5389fddb0e9e15d016b797"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.349431 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.349354837 podStartE2EDuration="12.349354837s" podCreationTimestamp="2026-02-17 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:39.337651888 +0000 UTC m=+1345.680580167" watchObservedRunningTime="2026-02-17 14:28:39.349354837 +0000 UTC m=+1345.692283126"
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.350545 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" event={"ID":"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0","Type":"ContainerStarted","Data":"769356b8227ec8df8f5b18dca0f8472d3df22108be6b842885971987f0b77c6e"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.367633 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerStarted","Data":"b11cf843196ed96ab329470f8fb90c845e937e84667798d3853568520da77e41"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.385736 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-pvljf" event={"ID":"4e016162-2025-44ad-989d-ce71d9f8f9bf","Type":"ContainerStarted","Data":"fc7f81c47e20cce7a74c227545b963bd61d6dadbccf7dacfaa97a9b912354775"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.400468 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-649d9995c8-rcxvp" event={"ID":"ed247b9d-af54-401e-80a3-82d18772f29d","Type":"ContainerStarted","Data":"569d3d6ec6caf0a5240f5c4b4d890d8ae7a0c22c3878dcb1e8c71559ae9f5a26"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.402480 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78c4d587b5-cqhdl" event={"ID":"f2f9acba-3f54-43b6-9461-31cba0cc954b","Type":"ContainerStarted","Data":"3ae7c7cdf4ba5c76850b19dacd65afe9e21b18d8d278aea8459f693c9b38a7d0"}
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.423857 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.423829491 podStartE2EDuration="12.423829491s" podCreationTimestamp="2026-02-17 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:39.3778131 +0000 UTC m=+1345.720741379" watchObservedRunningTime="2026-02-17 14:28:39.423829491 +0000 UTC m=+1345.766757760"
Feb 17 14:28:39 crc kubenswrapper[4836]: I0217 14:28:39.429971 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-pvljf" podStartSLOduration=5.486771198 podStartE2EDuration="1m3.429945457s" podCreationTimestamp="2026-02-17 14:27:36 +0000 UTC" firstStartedPulling="2026-02-17 14:27:40.031338307 +0000 UTC m=+1286.374266576" lastFinishedPulling="2026-02-17 14:28:37.974512566 +0000 UTC m=+1344.317440835" observedRunningTime="2026-02-17 14:28:39.416662447 +0000 UTC m=+1345.759590716" watchObservedRunningTime="2026-02-17 14:28:39.429945457 +0000 UTC m=+1345.772873726"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.147527 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.288858 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-scripts\") pod \"8185c649-f1ad-4230-830d-07d002e5b358\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") "
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.288951 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-db-sync-config-data\") pod \"8185c649-f1ad-4230-830d-07d002e5b358\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") "
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.289063 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8185c649-f1ad-4230-830d-07d002e5b358-etc-machine-id\") pod \"8185c649-f1ad-4230-830d-07d002e5b358\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") "
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.289142 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-combined-ca-bundle\") pod \"8185c649-f1ad-4230-830d-07d002e5b358\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") "
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.289218 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftmbq\" (UniqueName: \"kubernetes.io/projected/8185c649-f1ad-4230-830d-07d002e5b358-kube-api-access-ftmbq\") pod \"8185c649-f1ad-4230-830d-07d002e5b358\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") "
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.289441 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-config-data\") pod \"8185c649-f1ad-4230-830d-07d002e5b358\" (UID: \"8185c649-f1ad-4230-830d-07d002e5b358\") "
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.289519 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8185c649-f1ad-4230-830d-07d002e5b358-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8185c649-f1ad-4230-830d-07d002e5b358" (UID: "8185c649-f1ad-4230-830d-07d002e5b358"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.291774 4836 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8185c649-f1ad-4230-830d-07d002e5b358-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.312705 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-scripts" (OuterVolumeSpecName: "scripts") pod "8185c649-f1ad-4230-830d-07d002e5b358" (UID: "8185c649-f1ad-4230-830d-07d002e5b358"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.325571 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8185c649-f1ad-4230-830d-07d002e5b358-kube-api-access-ftmbq" (OuterVolumeSpecName: "kube-api-access-ftmbq") pod "8185c649-f1ad-4230-830d-07d002e5b358" (UID: "8185c649-f1ad-4230-830d-07d002e5b358"). InnerVolumeSpecName "kube-api-access-ftmbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.325697 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8185c649-f1ad-4230-830d-07d002e5b358" (UID: "8185c649-f1ad-4230-830d-07d002e5b358"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.374522 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8185c649-f1ad-4230-830d-07d002e5b358" (UID: "8185c649-f1ad-4230-830d-07d002e5b358"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.396587 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.396650 4836 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.396666 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.396678 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftmbq\" (UniqueName: \"kubernetes.io/projected/8185c649-f1ad-4230-830d-07d002e5b358-kube-api-access-ftmbq\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.424514 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-config-data" (OuterVolumeSpecName: "config-data") pod "8185c649-f1ad-4230-830d-07d002e5b358" (UID: "8185c649-f1ad-4230-830d-07d002e5b358"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.442413 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-qqwhc" event={"ID":"8185c649-f1ad-4230-830d-07d002e5b358","Type":"ContainerDied","Data":"b3482ed7c18ae58a71068d39ec0f731b2f5c23d1bee2fd95e9d280383de59ee3"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.442464 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3482ed7c18ae58a71068d39ec0f731b2f5c23d1bee2fd95e9d280383de59ee3"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.442490 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-qqwhc"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.455198 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-649d9995c8-rcxvp" event={"ID":"ed247b9d-af54-401e-80a3-82d18772f29d","Type":"ContainerStarted","Data":"0f637c77116f7f89955d3abdef322406d505cefd736b74cd4ec4ac6a045c16f5"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.455256 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-649d9995c8-rcxvp" event={"ID":"ed247b9d-af54-401e-80a3-82d18772f29d","Type":"ContainerStarted","Data":"1cbfdb3ab6153a9ef06f48a83598052cef7a8d6eccc057f49dbfbf1a10abee8c"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.455364 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-649d9995c8-rcxvp"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.472178 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-78c4d587b5-cqhdl" event={"ID":"f2f9acba-3f54-43b6-9461-31cba0cc954b","Type":"ContainerStarted","Data":"257eab0890393e6caf8d16464988774c8d840eb8ba2300d1cba4536e48d1671d"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.479500 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-78c4d587b5-cqhdl"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.529476 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8185c649-f1ad-4230-830d-07d002e5b358-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.547528 4836 generic.go:334] "Generic (PLEG): container finished" podID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerID="42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4" exitCode=0
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.555692 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" event={"ID":"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c","Type":"ContainerStarted","Data":"c922d67f76cc408680821ed49972a5114ab647a61b7cd84843e32accf4f0fc8b"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.555803 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.555822 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" event={"ID":"62b902ba-6ba2-48f3-a6dc-652fd1d6d58c","Type":"ContainerStarted","Data":"5ba46c4e4411bd77b1697c5688742a0dcb01cc6b5035e314a1b65cd71fd750dd"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.555834 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.555856 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" event={"ID":"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0","Type":"ContainerDied","Data":"42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.560461 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-649d9995c8-rcxvp" podStartSLOduration=9.560437445 podStartE2EDuration="9.560437445s" podCreationTimestamp="2026-02-17 14:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:40.526480722 +0000 UTC m=+1346.869409011" watchObservedRunningTime="2026-02-17 14:28:40.560437445 +0000 UTC m=+1346.903365714"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.578870 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-78c4d587b5-cqhdl" podStartSLOduration=10.578846645 podStartE2EDuration="10.578846645s" podCreationTimestamp="2026-02-17 14:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:40.557029462 +0000 UTC m=+1346.899957741" watchObservedRunningTime="2026-02-17 14:28:40.578846645 +0000 UTC m=+1346.921774904"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.613953 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc958ddf6-kh2rq" event={"ID":"42c3b1e3-728a-4bd8-9669-bfe1656b6de2","Type":"ContainerStarted","Data":"74874321dc0f21c297655ffb8e49b9e5f17f6048c138f35b62d786cdb8a831aa"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.614045 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bc958ddf6-kh2rq"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.614065 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-bc958ddf6-kh2rq"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.636814 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" podStartSLOduration=6.636748799 podStartE2EDuration="6.636748799s" podCreationTimestamp="2026-02-17 14:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:40.61101809 +0000 UTC m=+1346.953946369" watchObservedRunningTime="2026-02-17 14:28:40.636748799 +0000 UTC m=+1346.979677088"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.648633 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d7557768-wvvpt" event={"ID":"21c73844-3235-4a12-9f77-901ba8614e11","Type":"ContainerStarted","Data":"f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.648705 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d7557768-wvvpt" event={"ID":"21c73844-3235-4a12-9f77-901ba8614e11","Type":"ContainerStarted","Data":"d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9"}
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.649222 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55d7557768-wvvpt"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.649246 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-55d7557768-wvvpt"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.692489 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 14:28:40 crc kubenswrapper[4836]: E0217 14:28:40.693068 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8185c649-f1ad-4230-830d-07d002e5b358" containerName="cinder-db-sync"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.693084 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="8185c649-f1ad-4230-830d-07d002e5b358" containerName="cinder-db-sync"
Feb 17 14:28:40 crc kubenswrapper[4836]: E0217 14:28:40.693096 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerName="init"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.693102 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerName="init"
Feb 17 14:28:40 crc kubenswrapper[4836]: E0217 14:28:40.693108 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerName="dnsmasq-dns"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.693114 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerName="dnsmasq-dns"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.693387 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="8185c649-f1ad-4230-830d-07d002e5b358" containerName="cinder-db-sync"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.693398 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccce7d80-ec87-4fb2-a75f-1b5ddc2f4be9" containerName="dnsmasq-dns"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.694668 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.703957 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.705801 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cg95t"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.705867 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.705891 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.783452 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.832057 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-55d7557768-wvvpt" podStartSLOduration=10.832026249 podStartE2EDuration="10.832026249s" podCreationTimestamp="2026-02-17 14:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:40.729431899 +0000 UTC m=+1347.072360188" watchObservedRunningTime="2026-02-17 14:28:40.832026249 +0000 UTC m=+1347.174954518"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.838898 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-scripts\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.838993 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.839087 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.839199 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85d3a41-bec9-4783-a2c6-2e6627156cce-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.839493 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.839647 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcwnw\" (UniqueName: \"kubernetes.io/projected/f85d3a41-bec9-4783-a2c6-2e6627156cce-kube-api-access-mcwnw\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.915533 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-bc958ddf6-kh2rq" podStartSLOduration=10.915499708 podStartE2EDuration="10.915499708s" podCreationTimestamp="2026-02-17 14:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:40.782925574 +0000 UTC m=+1347.125853843" watchObservedRunningTime="2026-02-17 14:28:40.915499708 +0000 UTC m=+1347.258427967"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.944350 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.944431 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcwnw\" (UniqueName: \"kubernetes.io/projected/f85d3a41-bec9-4783-a2c6-2e6627156cce-kube-api-access-mcwnw\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.944484 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-scripts\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.944507 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.944551 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.944597 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85d3a41-bec9-4783-a2c6-2e6627156cce-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:40 crc kubenswrapper[4836]: I0217 14:28:40.944721 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85d3a41-bec9-4783-a2c6-2e6627156cce-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.063112 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9ds8"]
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.086433 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-nvkvs"]
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.088982 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs"
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.111448 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-scripts\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.111934 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.116331 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-nvkvs"]
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.138163 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcwnw\" (UniqueName: \"kubernetes.io/projected/f85d3a41-bec9-4783-a2c6-2e6627156cce-kube-api-access-mcwnw\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.138616 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " pod="openstack/cinder-scheduler-0"
Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.142681 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data\") pod \"cinder-scheduler-0\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\")
" pod="openstack/cinder-scheduler-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.149507 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-config\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.154433 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvggg\" (UniqueName: \"kubernetes.io/projected/79b71acb-6b55-4f99-8b13-0c5aea065cbb-kube-api-access-zvggg\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.154600 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.154938 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.155164 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: 
\"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.155250 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.164504 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.167042 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.186038 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.186786 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258533 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258606 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258659 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68fadcf3-845e-4605-add5-6b5b6092e443-logs\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258692 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data-custom\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258780 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrtlp\" (UniqueName: \"kubernetes.io/projected/68fadcf3-845e-4605-add5-6b5b6092e443-kube-api-access-wrtlp\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258812 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-config\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258860 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvggg\" (UniqueName: \"kubernetes.io/projected/79b71acb-6b55-4f99-8b13-0c5aea065cbb-kube-api-access-zvggg\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258877 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-scripts\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258920 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.258951 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.259994 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.260056 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-config\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.261039 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: 
\"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.261128 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.261251 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.261323 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68fadcf3-845e-4605-add5-6b5b6092e443-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.261362 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.262036 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.305373 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvggg\" (UniqueName: \"kubernetes.io/projected/79b71acb-6b55-4f99-8b13-0c5aea065cbb-kube-api-access-zvggg\") pod \"dnsmasq-dns-5c9776ccc5-nvkvs\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") " pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.345629 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.363394 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrtlp\" (UniqueName: \"kubernetes.io/projected/68fadcf3-845e-4605-add5-6b5b6092e443-kube-api-access-wrtlp\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.363519 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-scripts\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.363573 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.363631 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68fadcf3-845e-4605-add5-6b5b6092e443-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") 
" pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.363658 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.363737 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68fadcf3-845e-4605-add5-6b5b6092e443-logs\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.363771 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data-custom\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.366268 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68fadcf3-845e-4605-add5-6b5b6092e443-etc-machine-id\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.371927 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68fadcf3-845e-4605-add5-6b5b6092e443-logs\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.375965 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data-custom\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.382334 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.383121 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.392010 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-scripts\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.402535 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrtlp\" (UniqueName: \"kubernetes.io/projected/68fadcf3-845e-4605-add5-6b5b6092e443-kube-api-access-wrtlp\") pod \"cinder-api-0\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") " pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.559405 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.601779 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.673422 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bc958ddf6-kh2rq" event={"ID":"42c3b1e3-728a-4bd8-9669-bfe1656b6de2","Type":"ContainerStarted","Data":"cd82728e9e642b4fdcdb3da0600fffa9b73fbd7c12deb985a9bb63d9550eb3b7"} Feb 17 14:28:41 crc kubenswrapper[4836]: I0217 14:28:41.675017 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-649d9995c8-rcxvp" Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.418685 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.680406 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.751282 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" event={"ID":"f79d706e-2d22-49c6-acb5-dc3f130ab102","Type":"ContainerStarted","Data":"5d611f254cd8b26e577a0770e4344757470d1efb9c2b80eb7d014f808d33145d"} Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.761772 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6567fb9c77-xcq7p" event={"ID":"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd","Type":"ContainerStarted","Data":"cada7ae52988ba210d8ab4a87f2fde9c64110427784b4893df0918b6e1e9743f"} Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.768254 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" event={"ID":"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0","Type":"ContainerStarted","Data":"285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8"} Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.768529 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" 
podUID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerName="dnsmasq-dns" containerID="cri-o://285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8" gracePeriod=10 Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.768659 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.806262 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" podStartSLOduration=12.806229064 podStartE2EDuration="12.806229064s" podCreationTimestamp="2026-02-17 14:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:43.792873601 +0000 UTC m=+1350.135801880" watchObservedRunningTime="2026-02-17 14:28:43.806229064 +0000 UTC m=+1350.149157333" Feb 17 14:28:43 crc kubenswrapper[4836]: I0217 14:28:43.844019 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:28:43 crc kubenswrapper[4836]: W0217 14:28:43.864530 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf85d3a41_bec9_4783_a2c6_2e6627156cce.slice/crio-d7078c4a99b5b82f758808cb22901fe00a2b1684715c3195cb32e503ee688469 WatchSource:0}: Error finding container d7078c4a99b5b82f758808cb22901fe00a2b1684715c3195cb32e503ee688469: Status 404 returned error can't find the container with id d7078c4a99b5b82f758808cb22901fe00a2b1684715c3195cb32e503ee688469 Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.039586 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-nvkvs"] Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.454558 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:28:44 crc 
kubenswrapper[4836]: I0217 14:28:44.563574 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.608743 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spw5p\" (UniqueName: \"kubernetes.io/projected/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-kube-api-access-spw5p\") pod \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.608859 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-nb\") pod \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.608905 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-sb\") pod \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.608935 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-swift-storage-0\") pod \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.609110 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-config\") pod \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 
14:28:44.609209 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-svc\") pod \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\" (UID: \"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0\") " Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.646431 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-kube-api-access-spw5p" (OuterVolumeSpecName: "kube-api-access-spw5p") pod "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" (UID: "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0"). InnerVolumeSpecName "kube-api-access-spw5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.713354 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spw5p\" (UniqueName: \"kubernetes.io/projected/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-kube-api-access-spw5p\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.795702 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" (UID: "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.824480 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" (UID: "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.833004 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.833120 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.892712 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" (UID: "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.899691 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" event={"ID":"f79d706e-2d22-49c6-acb5-dc3f130ab102","Type":"ContainerStarted","Data":"c5627163cba14a6237b268dc4e67eff4bc9b5ac5e211c616969c2bb5f1c1dfe4"} Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.945389 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 14:28:44.951878 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6567fb9c77-xcq7p" event={"ID":"bf33e52a-365f-4ccc-8352-f4c7f8e2aebd","Type":"ContainerStarted","Data":"cc1332951352adff0e97b1b3dcac0fe2080c98fdd0a7ef00edae5a858db9e20d"} Feb 17 14:28:44 crc kubenswrapper[4836]: I0217 
14:28:44.961404 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-config" (OuterVolumeSpecName: "config") pod "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" (UID: "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:44.999570 4836 generic.go:334] "Generic (PLEG): container finished" podID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerID="285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8" exitCode=0
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:44.999690 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" event={"ID":"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0","Type":"ContainerDied","Data":"285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8"}
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:44.999728 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8" event={"ID":"d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0","Type":"ContainerDied","Data":"769356b8227ec8df8f5b18dca0f8472d3df22108be6b842885971987f0b77c6e"}
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:44.999751 4836 scope.go:117] "RemoveContainer" containerID="285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:44.999949 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-m9ds8"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.019215 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" event={"ID":"79b71acb-6b55-4f99-8b13-0c5aea065cbb","Type":"ContainerStarted","Data":"1fe3b4e682953cc1e2a6a78ba19a4ca238a5effc3b1823c6d9c0ce3876e226a4"}
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.030353 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85d3a41-bec9-4783-a2c6-2e6627156cce","Type":"ContainerStarted","Data":"d7078c4a99b5b82f758808cb22901fe00a2b1684715c3195cb32e503ee688469"}
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.042216 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68fadcf3-845e-4605-add5-6b5b6092e443","Type":"ContainerStarted","Data":"f48faf4eb5b80c2380cc0bd40d17893777d9a019452082ae03e6f15d493c9099"}
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.051832 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.064740 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bc789578f-mcrrx"]
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.065095 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bc789578f-mcrrx" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-api" containerID="cri-o://ef329f1c472e28115c477d0f824ce0452609f500341f5fe161170bb1b7dd1f36" gracePeriod=30
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.066081 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-bc789578f-mcrrx" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-httpd" containerID="cri-o://29474b05f933bb7261368e691fe6f6124baae6cdcaac7f0997ad485f3fcff20d" gracePeriod=30
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.096341 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fc4994bf7-cqhhj"]
Feb 17 14:28:45 crc kubenswrapper[4836]: E0217 14:28:45.096841 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerName="dnsmasq-dns"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.096857 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerName="dnsmasq-dns"
Feb 17 14:28:45 crc kubenswrapper[4836]: E0217 14:28:45.096902 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerName="init"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.096910 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerName="init"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.097134 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" containerName="dnsmasq-dns"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.100451 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.111574 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" (UID: "d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.125461 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-bc789578f-mcrrx"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.155261 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.159723 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fc4994bf7-cqhhj"]
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.226109 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-68fd77ffbb-m5r5c" podStartSLOduration=10.421944468 podStartE2EDuration="14.226078448s" podCreationTimestamp="2026-02-17 14:28:31 +0000 UTC" firstStartedPulling="2026-02-17 14:28:39.24281577 +0000 UTC m=+1345.585744039" lastFinishedPulling="2026-02-17 14:28:43.04694975 +0000 UTC m=+1349.389878019" observedRunningTime="2026-02-17 14:28:44.946214659 +0000 UTC m=+1351.289142958" watchObservedRunningTime="2026-02-17 14:28:45.226078448 +0000 UTC m=+1351.569006727"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.250822 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6567fb9c77-xcq7p" podStartSLOduration=10.405541462 podStartE2EDuration="14.25078654s" podCreationTimestamp="2026-02-17 14:28:31 +0000 UTC" firstStartedPulling="2026-02-17 14:28:39.24245248 +0000 UTC m=+1345.585380749" lastFinishedPulling="2026-02-17 14:28:43.087697558 +0000 UTC m=+1349.430625827" observedRunningTime="2026-02-17 14:28:45.017015684 +0000 UTC m=+1351.359943963" watchObservedRunningTime="2026-02-17 14:28:45.25078654 +0000 UTC m=+1351.593714819"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.257214 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-config\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.258875 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-ovndb-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.258984 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8597g\" (UniqueName: \"kubernetes.io/projected/88848d0f-5d90-4ca0-9a78-d08e73159601-kube-api-access-8597g\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.259080 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-combined-ca-bundle\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.259196 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-httpd-config\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.259347 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-public-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.259519 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-internal-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.283969 4836 scope.go:117] "RemoveContainer" containerID="42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.362137 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8597g\" (UniqueName: \"kubernetes.io/projected/88848d0f-5d90-4ca0-9a78-d08e73159601-kube-api-access-8597g\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.362966 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-combined-ca-bundle\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.364029 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-httpd-config\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.364394 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-public-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.364676 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-internal-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.364865 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-config\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.365005 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-ovndb-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.369684 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-internal-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.371124 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-combined-ca-bundle\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.371987 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-ovndb-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.377549 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-public-tls-certs\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.378810 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-httpd-config\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.382895 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/88848d0f-5d90-4ca0-9a78-d08e73159601-config\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.383669 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8597g\" (UniqueName: \"kubernetes.io/projected/88848d0f-5d90-4ca0-9a78-d08e73159601-kube-api-access-8597g\") pod \"neutron-6fc4994bf7-cqhhj\" (UID: \"88848d0f-5d90-4ca0-9a78-d08e73159601\") " pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.546677 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9ds8"]
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.564694 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.588083 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-m9ds8"]
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.671858 4836 scope.go:117] "RemoveContainer" containerID="285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8"
Feb 17 14:28:45 crc kubenswrapper[4836]: E0217 14:28:45.674270 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8\": container with ID starting with 285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8 not found: ID does not exist" containerID="285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.674356 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8"} err="failed to get container status \"285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8\": rpc error: code = NotFound desc = could not find container \"285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8\": container with ID starting with 285379d1eba3979dcb5d94df1c00afbe970d0914d5d9084d64761ce8e89152e8 not found: ID does not exist"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.674414 4836 scope.go:117] "RemoveContainer" containerID="42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4"
Feb 17 14:28:45 crc kubenswrapper[4836]: E0217 14:28:45.675128 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4\": container with ID starting with 42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4 not found: ID does not exist" containerID="42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4"
Feb 17 14:28:45 crc kubenswrapper[4836]: I0217 14:28:45.675169 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4"} err="failed to get container status \"42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4\": rpc error: code = NotFound desc = could not find container \"42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4\": container with ID starting with 42d2d98237842a647a52d62a45a7a321647f264ef6fe4dba4d7263c62fd90bc4 not found: ID does not exist"
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.103992 4836 generic.go:334] "Generic (PLEG): container finished" podID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerID="29474b05f933bb7261368e691fe6f6124baae6cdcaac7f0997ad485f3fcff20d" exitCode=0
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.104460 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc789578f-mcrrx" event={"ID":"a7dc98d2-302d-4633-8123-fe76bb7dbd40","Type":"ContainerDied","Data":"29474b05f933bb7261368e691fe6f6124baae6cdcaac7f0997ad485f3fcff20d"}
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.108130 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68fadcf3-845e-4605-add5-6b5b6092e443","Type":"ContainerStarted","Data":"1faaca327f5f255f0110639e69475e673f9cf85d2a5f0e368b4e02002323c46e"}
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.165725 4836 generic.go:334] "Generic (PLEG): container finished" podID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerID="277bd33eae834b988e7c295c653ee707631d0efdc5453cfacb6a97be01ceb016" exitCode=0
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.166145 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" event={"ID":"79b71acb-6b55-4f99-8b13-0c5aea065cbb","Type":"ContainerDied","Data":"277bd33eae834b988e7c295c653ee707631d0efdc5453cfacb6a97be01ceb016"}
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.413217 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fc4994bf7-cqhhj"]
Feb 17 14:28:46 crc kubenswrapper[4836]: W0217 14:28:46.497228 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88848d0f_5d90_4ca0_9a78_d08e73159601.slice/crio-05963d65db74fb793e8950376bfdf51fdcf588dc4e9e04371a2fbbc4a0e16f14 WatchSource:0}: Error finding container 05963d65db74fb793e8950376bfdf51fdcf588dc4e9e04371a2fbbc4a0e16f14: Status 404 returned error can't find the container with id 05963d65db74fb793e8950376bfdf51fdcf588dc4e9e04371a2fbbc4a0e16f14
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.591412 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0" path="/var/lib/kubelet/pods/d2472d16-7ff3-4f1c-a9b8-045dd8ffa6d0/volumes"
Feb 17 14:28:46 crc kubenswrapper[4836]: I0217 14:28:46.812905 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-bc789578f-mcrrx" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.170:9696/\": dial tcp 10.217.0.170:9696: connect: connection refused"
Feb 17 14:28:47 crc kubenswrapper[4836]: I0217 14:28:47.263451 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fc4994bf7-cqhhj" event={"ID":"88848d0f-5d90-4ca0-9a78-d08e73159601","Type":"ContainerStarted","Data":"05963d65db74fb793e8950376bfdf51fdcf588dc4e9e04371a2fbbc4a0e16f14"}
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.319589 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" event={"ID":"79b71acb-6b55-4f99-8b13-0c5aea065cbb","Type":"ContainerStarted","Data":"116ce92f31628ecf8d5384bc487f6288540b5a8b08da5572838c3c49083bb344"}
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.321681 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.342531 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85d3a41-bec9-4783-a2c6-2e6627156cce","Type":"ContainerStarted","Data":"713b2d67a8bcdbac716e75c48ae87c2e03fb9564a7bca1ee0f598db4994298c7"}
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.345048 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68fadcf3-845e-4605-add5-6b5b6092e443","Type":"ContainerStarted","Data":"46227394029fb44dba4a63e6488a9dfab06641f1dffd60fc1769e1f7240d858f"}
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.345270 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api-log" containerID="cri-o://1faaca327f5f255f0110639e69475e673f9cf85d2a5f0e368b4e02002323c46e" gracePeriod=30
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.345675 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.345716 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api" containerID="cri-o://46227394029fb44dba4a63e6488a9dfab06641f1dffd60fc1769e1f7240d858f" gracePeriod=30
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.363453 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.363619 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.366470 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" podStartSLOduration=8.366443561 podStartE2EDuration="8.366443561s" podCreationTimestamp="2026-02-17 14:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:48.352634376 +0000 UTC m=+1354.695562655" watchObservedRunningTime="2026-02-17 14:28:48.366443561 +0000 UTC m=+1354.709371830"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.381601 4836 generic.go:334] "Generic (PLEG): container finished" podID="4e016162-2025-44ad-989d-ce71d9f8f9bf" containerID="fc7f81c47e20cce7a74c227545b963bd61d6dadbccf7dacfaa97a9b912354775" exitCode=0
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.381734 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-pvljf" event={"ID":"4e016162-2025-44ad-989d-ce71d9f8f9bf","Type":"ContainerDied","Data":"fc7f81c47e20cce7a74c227545b963bd61d6dadbccf7dacfaa97a9b912354775"}
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.398377 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.398349279 podStartE2EDuration="7.398349279s" podCreationTimestamp="2026-02-17 14:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:48.38551849 +0000 UTC m=+1354.728446789" watchObservedRunningTime="2026-02-17 14:28:48.398349279 +0000 UTC m=+1354.741277558"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.412569 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fc4994bf7-cqhhj" event={"ID":"88848d0f-5d90-4ca0-9a78-d08e73159601","Type":"ContainerStarted","Data":"4f76baeee98c161173add9a106f2846a4169503adf035510e6744fb02106f608"}
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.414691 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.459650 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.460068 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.485661 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.497099 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.501992 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fc4994bf7-cqhhj" podStartSLOduration=4.501961036 podStartE2EDuration="4.501961036s" podCreationTimestamp="2026-02-17 14:28:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:28:48.45245549 +0000 UTC m=+1354.795383789" watchObservedRunningTime="2026-02-17 14:28:48.501961036 +0000 UTC m=+1354.844889325"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.548726 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.735370 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.747527 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.761073 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.904574 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-649d9995c8-rcxvp"]
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.905409 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" containerID="cri-o://1cbfdb3ab6153a9ef06f48a83598052cef7a8d6eccc057f49dbfbf1a10abee8c" gracePeriod=30
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.906393 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" containerID="cri-o://0f637c77116f7f89955d3abdef322406d505cefd736b74cd4ec4ac6a045c16f5" gracePeriod=30
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.923735 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": EOF"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.924444 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": EOF"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.924612 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": EOF"
Feb 17 14:28:48 crc kubenswrapper[4836]: I0217 14:28:48.924737 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": EOF"
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.443603 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fc4994bf7-cqhhj" event={"ID":"88848d0f-5d90-4ca0-9a78-d08e73159601","Type":"ContainerStarted","Data":"7cede19c06071d792fd48dc22325538e54e536266b5f02989155c0035ae46d82"}
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.475180 4836 generic.go:334] "Generic (PLEG): container finished" podID="ed247b9d-af54-401e-80a3-82d18772f29d" containerID="1cbfdb3ab6153a9ef06f48a83598052cef7a8d6eccc057f49dbfbf1a10abee8c" exitCode=143
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.475316 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-649d9995c8-rcxvp" event={"ID":"ed247b9d-af54-401e-80a3-82d18772f29d","Type":"ContainerDied","Data":"1cbfdb3ab6153a9ef06f48a83598052cef7a8d6eccc057f49dbfbf1a10abee8c"}
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.495632 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85d3a41-bec9-4783-a2c6-2e6627156cce","Type":"ContainerStarted","Data":"b853d8aee890fe988c231f41ba4f29bef7974b65a59ee2897863ebef85aca6e3"}
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.534125 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.666934042 podStartE2EDuration="9.534098919s" podCreationTimestamp="2026-02-17 14:28:40 +0000 UTC" firstStartedPulling="2026-02-17 14:28:43.882463596 +0000 UTC m=+1350.225391865" lastFinishedPulling="2026-02-17 14:28:45.749628473 +0000 UTC m=+1352.092556742" observedRunningTime="2026-02-17 14:28:49.533372468 +0000 UTC m=+1355.876300737" watchObservedRunningTime="2026-02-17 14:28:49.534098919 +0000 UTC m=+1355.877027198"
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.549194 4836 generic.go:334] "Generic (PLEG): container finished" podID="68fadcf3-845e-4605-add5-6b5b6092e443" containerID="46227394029fb44dba4a63e6488a9dfab06641f1dffd60fc1769e1f7240d858f" exitCode=0
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.549668 4836 generic.go:334] "Generic (PLEG): container finished" podID="68fadcf3-845e-4605-add5-6b5b6092e443" containerID="1faaca327f5f255f0110639e69475e673f9cf85d2a5f0e368b4e02002323c46e" exitCode=143
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.549416 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68fadcf3-845e-4605-add5-6b5b6092e443","Type":"ContainerDied","Data":"46227394029fb44dba4a63e6488a9dfab06641f1dffd60fc1769e1f7240d858f"}
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.552049 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.552147 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68fadcf3-845e-4605-add5-6b5b6092e443","Type":"ContainerDied","Data":"1faaca327f5f255f0110639e69475e673f9cf85d2a5f0e368b4e02002323c46e"}
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.554922 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.554965 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.554982 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.830759 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.909733 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-scripts\") pod \"68fadcf3-845e-4605-add5-6b5b6092e443\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") "
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.910245 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrtlp\" (UniqueName: \"kubernetes.io/projected/68fadcf3-845e-4605-add5-6b5b6092e443-kube-api-access-wrtlp\") pod \"68fadcf3-845e-4605-add5-6b5b6092e443\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") "
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.910408 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68fadcf3-845e-4605-add5-6b5b6092e443-logs\") pod \"68fadcf3-845e-4605-add5-6b5b6092e443\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") "
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.910455 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data-custom\") pod \"68fadcf3-845e-4605-add5-6b5b6092e443\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") "
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.910647 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68fadcf3-845e-4605-add5-6b5b6092e443-etc-machine-id\") pod \"68fadcf3-845e-4605-add5-6b5b6092e443\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") "
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.910691 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data\") pod \"68fadcf3-845e-4605-add5-6b5b6092e443\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") "
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.910748 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-combined-ca-bundle\") pod \"68fadcf3-845e-4605-add5-6b5b6092e443\" (UID: \"68fadcf3-845e-4605-add5-6b5b6092e443\") "
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.911249 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68fadcf3-845e-4605-add5-6b5b6092e443-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "68fadcf3-845e-4605-add5-6b5b6092e443" (UID: "68fadcf3-845e-4605-add5-6b5b6092e443"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.912610 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68fadcf3-845e-4605-add5-6b5b6092e443-logs" (OuterVolumeSpecName: "logs") pod "68fadcf3-845e-4605-add5-6b5b6092e443" (UID: "68fadcf3-845e-4605-add5-6b5b6092e443"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.922164 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68fadcf3-845e-4605-add5-6b5b6092e443-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.922212 4836 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68fadcf3-845e-4605-add5-6b5b6092e443-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.948721 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-scripts" (OuterVolumeSpecName: "scripts") pod "68fadcf3-845e-4605-add5-6b5b6092e443" (UID: "68fadcf3-845e-4605-add5-6b5b6092e443"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.949156 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68fadcf3-845e-4605-add5-6b5b6092e443-kube-api-access-wrtlp" (OuterVolumeSpecName: "kube-api-access-wrtlp") pod "68fadcf3-845e-4605-add5-6b5b6092e443" (UID: "68fadcf3-845e-4605-add5-6b5b6092e443"). InnerVolumeSpecName "kube-api-access-wrtlp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:28:49 crc kubenswrapper[4836]: I0217 14:28:49.949266 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "68fadcf3-845e-4605-add5-6b5b6092e443" (UID: "68fadcf3-845e-4605-add5-6b5b6092e443"). InnerVolumeSpecName "config-data-custom".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.010348 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68fadcf3-845e-4605-add5-6b5b6092e443" (UID: "68fadcf3-845e-4605-add5-6b5b6092e443"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.010521 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data" (OuterVolumeSpecName: "config-data") pod "68fadcf3-845e-4605-add5-6b5b6092e443" (UID: "68fadcf3-845e-4605-add5-6b5b6092e443"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.024932 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.024986 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.024998 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.025007 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrtlp\" (UniqueName: \"kubernetes.io/projected/68fadcf3-845e-4605-add5-6b5b6092e443-kube-api-access-wrtlp\") on node \"crc\" DevicePath \"\"" Feb 17 
14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.025017 4836 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68fadcf3-845e-4605-add5-6b5b6092e443-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.472269 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-pvljf" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.527847 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-combined-ca-bundle\") pod \"4e016162-2025-44ad-989d-ce71d9f8f9bf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.528007 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-scripts\") pod \"4e016162-2025-44ad-989d-ce71d9f8f9bf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.528482 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-certs\") pod \"4e016162-2025-44ad-989d-ce71d9f8f9bf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.528578 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-config-data\") pod \"4e016162-2025-44ad-989d-ce71d9f8f9bf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.528702 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-hfrn2\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-kube-api-access-hfrn2\") pod \"4e016162-2025-44ad-989d-ce71d9f8f9bf\" (UID: \"4e016162-2025-44ad-989d-ce71d9f8f9bf\") " Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.555609 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-scripts" (OuterVolumeSpecName: "scripts") pod "4e016162-2025-44ad-989d-ce71d9f8f9bf" (UID: "4e016162-2025-44ad-989d-ce71d9f8f9bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.557606 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-certs" (OuterVolumeSpecName: "certs") pod "4e016162-2025-44ad-989d-ce71d9f8f9bf" (UID: "4e016162-2025-44ad-989d-ce71d9f8f9bf"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.558711 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-kube-api-access-hfrn2" (OuterVolumeSpecName: "kube-api-access-hfrn2") pod "4e016162-2025-44ad-989d-ce71d9f8f9bf" (UID: "4e016162-2025-44ad-989d-ce71d9f8f9bf"). InnerVolumeSpecName "kube-api-access-hfrn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.600682 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-config-data" (OuterVolumeSpecName: "config-data") pod "4e016162-2025-44ad-989d-ce71d9f8f9bf" (UID: "4e016162-2025-44ad-989d-ce71d9f8f9bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.600740 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e016162-2025-44ad-989d-ce71d9f8f9bf" (UID: "4e016162-2025-44ad-989d-ce71d9f8f9bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.608967 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.626845 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-pvljf" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.632449 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.632491 4836 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.632505 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.632519 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfrn2\" (UniqueName: \"kubernetes.io/projected/4e016162-2025-44ad-989d-ce71d9f8f9bf-kube-api-access-hfrn2\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.632529 4836 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e016162-2025-44ad-989d-ce71d9f8f9bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.669081 4836 generic.go:334] "Generic (PLEG): container finished" podID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerID="ef329f1c472e28115c477d0f824ce0452609f500341f5fe161170bb1b7dd1f36" exitCode=0 Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.676540 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"68fadcf3-845e-4605-add5-6b5b6092e443","Type":"ContainerDied","Data":"f48faf4eb5b80c2380cc0bd40d17893777d9a019452082ae03e6f15d493c9099"} Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.676610 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-pvljf" event={"ID":"4e016162-2025-44ad-989d-ce71d9f8f9bf","Type":"ContainerDied","Data":"5256492605b5f72154c618f9880c205b521d09a7d2c8e835b6a6c8642893045e"} Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.676631 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5256492605b5f72154c618f9880c205b521d09a7d2c8e835b6a6c8642893045e" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.676644 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc789578f-mcrrx" event={"ID":"a7dc98d2-302d-4633-8123-fe76bb7dbd40","Type":"ContainerDied","Data":"ef329f1c472e28115c477d0f824ce0452609f500341f5fe161170bb1b7dd1f36"} Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.676676 4836 scope.go:117] "RemoveContainer" containerID="46227394029fb44dba4a63e6488a9dfab06641f1dffd60fc1769e1f7240d858f" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.767685 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.798857 4836 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.805686 4836 scope.go:117] "RemoveContainer" containerID="1faaca327f5f255f0110639e69475e673f9cf85d2a5f0e368b4e02002323c46e" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.810644 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:50 crc kubenswrapper[4836]: E0217 14:28:50.811922 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.811948 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api" Feb 17 14:28:50 crc kubenswrapper[4836]: E0217 14:28:50.811970 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api-log" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.811977 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api-log" Feb 17 14:28:50 crc kubenswrapper[4836]: E0217 14:28:50.811987 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e016162-2025-44ad-989d-ce71d9f8f9bf" containerName="cloudkitty-db-sync" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.811994 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e016162-2025-44ad-989d-ce71d9f8f9bf" containerName="cloudkitty-db-sync" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.812236 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api-log" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.812258 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" containerName="cinder-api" Feb 17 14:28:50 crc kubenswrapper[4836]: 
I0217 14:28:50.812265 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e016162-2025-44ad-989d-ce71d9f8f9bf" containerName="cloudkitty-db-sync" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.813868 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.819620 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.820724 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.821013 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.822644 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979485 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979565 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979604 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979658 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8722776f-950d-46d6-8929-164cc70747af-logs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979718 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nprdx\" (UniqueName: \"kubernetes.io/projected/8722776f-950d-46d6-8929-164cc70747af-kube-api-access-nprdx\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979760 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-scripts\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979777 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-config-data-custom\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979850 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8722776f-950d-46d6-8929-164cc70747af-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:50 crc kubenswrapper[4836]: I0217 14:28:50.979890 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-config-data\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.083937 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nprdx\" (UniqueName: \"kubernetes.io/projected/8722776f-950d-46d6-8929-164cc70747af-kube-api-access-nprdx\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.084536 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-scripts\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.084573 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-config-data-custom\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.084827 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8722776f-950d-46d6-8929-164cc70747af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.084875 4836 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-config-data\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.084912 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.084951 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.084988 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.085060 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8722776f-950d-46d6-8929-164cc70747af-logs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.085752 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8722776f-950d-46d6-8929-164cc70747af-logs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc 
kubenswrapper[4836]: I0217 14:28:51.097361 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-scripts\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.097535 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8722776f-950d-46d6-8929-164cc70747af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.100605 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-config-data\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.111206 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.117162 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-config-data-custom\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.125445 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nprdx\" (UniqueName: \"kubernetes.io/projected/8722776f-950d-46d6-8929-164cc70747af-kube-api-access-nprdx\") pod \"cinder-api-0\" (UID: 
\"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.130328 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.130962 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8722776f-950d-46d6-8929-164cc70747af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8722776f-950d-46d6-8929-164cc70747af\") " pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.137713 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc789578f-mcrrx" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.159035 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.191600 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-config\") pod \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.191893 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-public-tls-certs\") pod \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.191964 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-internal-tls-certs\") pod \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.192003 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-httpd-config\") pod \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.192160 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-combined-ca-bundle\") pod \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.192242 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv8ws\" (UniqueName: 
\"kubernetes.io/projected/a7dc98d2-302d-4633-8123-fe76bb7dbd40-kube-api-access-nv8ws\") pod \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.192328 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-ovndb-tls-certs\") pod \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\" (UID: \"a7dc98d2-302d-4633-8123-fe76bb7dbd40\") " Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.209911 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a7dc98d2-302d-4633-8123-fe76bb7dbd40" (UID: "a7dc98d2-302d-4633-8123-fe76bb7dbd40"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.211227 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7dc98d2-302d-4633-8123-fe76bb7dbd40-kube-api-access-nv8ws" (OuterVolumeSpecName: "kube-api-access-nv8ws") pod "a7dc98d2-302d-4633-8123-fe76bb7dbd40" (UID: "a7dc98d2-302d-4633-8123-fe76bb7dbd40"). InnerVolumeSpecName "kube-api-access-nv8ws". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.297328 4836 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.297373 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv8ws\" (UniqueName: \"kubernetes.io/projected/a7dc98d2-302d-4633-8123-fe76bb7dbd40-kube-api-access-nv8ws\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.346866 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.365719 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7dc98d2-302d-4633-8123-fe76bb7dbd40" (UID: "a7dc98d2-302d-4633-8123-fe76bb7dbd40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.379082 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a7dc98d2-302d-4633-8123-fe76bb7dbd40" (UID: "a7dc98d2-302d-4633-8123-fe76bb7dbd40"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.405019 4836 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.405070 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.420806 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a7dc98d2-302d-4633-8123-fe76bb7dbd40" (UID: "a7dc98d2-302d-4633-8123-fe76bb7dbd40"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.826206 4836 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.841441 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a7dc98d2-302d-4633-8123-fe76bb7dbd40" (UID: "a7dc98d2-302d-4633-8123-fe76bb7dbd40"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.866936 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-config" (OuterVolumeSpecName: "config") pod "a7dc98d2-302d-4633-8123-fe76bb7dbd40" (UID: "a7dc98d2-302d-4633-8123-fe76bb7dbd40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.885458 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-bc789578f-mcrrx" event={"ID":"a7dc98d2-302d-4633-8123-fe76bb7dbd40","Type":"ContainerDied","Data":"61e1414473aaed7533a1bb0fd531409b1cf0fa9ea0b92c1ed51519923f9cbabf"}
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.885554 4836 scope.go:117] "RemoveContainer" containerID="29474b05f933bb7261368e691fe6f6124baae6cdcaac7f0997ad485f3fcff20d"
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.885754 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-bc789578f-mcrrx"
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.925715 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.925748 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.927620 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.927647 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.948576 4836 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:51 crc kubenswrapper[4836]: I0217 14:28:51.953205 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7dc98d2-302d-4633-8123-fe76bb7dbd40-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.032495 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-9z4jp"]
Feb 17 14:28:52 crc kubenswrapper[4836]: E0217 14:28:52.033515 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-httpd"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.033536 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-httpd"
Feb 17 14:28:52 crc kubenswrapper[4836]: E0217 14:28:52.033582 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-api"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.033592 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-api"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.033813 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-httpd"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.033849 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" containerName="neutron-api"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.034795 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.045698 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.045907 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.046021 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.046138 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.046258 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-l28cf"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.110073 4836 scope.go:117] "RemoveContainer" containerID="ef329f1c472e28115c477d0f824ce0452609f500341f5fe161170bb1b7dd1f36"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.117075 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-9z4jp"]
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.157612 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-certs\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.157759 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-config-data\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.157805 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-combined-ca-bundle\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.157897 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82gk8\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-kube-api-access-82gk8\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.158136 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-scripts\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.179448 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-bc789578f-mcrrx"]
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.256397 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-bc789578f-mcrrx"]
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.272274 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-scripts\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.272413 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-certs\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.272527 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-config-data\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.272603 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-combined-ca-bundle\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.272693 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82gk8\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-kube-api-access-82gk8\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.308456 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-combined-ca-bundle\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.311904 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-certs\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.322755 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82gk8\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-kube-api-access-82gk8\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.357631 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-scripts\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.369047 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-config-data\") pod \"cloudkitty-storageinit-9z4jp\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.398861 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-9z4jp"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.453090 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 17 14:28:52 crc kubenswrapper[4836]: W0217 14:28:52.508528 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8722776f_950d_46d6_8929_164cc70747af.slice/crio-9637dd07ce54453ecac25fa854611f26e2d792f0eece79611b4ba99df3a280fc WatchSource:0}: Error finding container 9637dd07ce54453ecac25fa854611f26e2d792f0eece79611b4ba99df3a280fc: Status 404 returned error can't find the container with id 9637dd07ce54453ecac25fa854611f26e2d792f0eece79611b4ba99df3a280fc
Feb 17 14:28:52 crc kubenswrapper[4836]: E0217 14:28:52.608508 4836 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7dc98d2_302d_4633_8123_fe76bb7dbd40.slice/crio-61e1414473aaed7533a1bb0fd531409b1cf0fa9ea0b92c1ed51519923f9cbabf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7dc98d2_302d_4633_8123_fe76bb7dbd40.slice\": RecentStats: unable to find data in memory cache]"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.720885 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68fadcf3-845e-4605-add5-6b5b6092e443" path="/var/lib/kubelet/pods/68fadcf3-845e-4605-add5-6b5b6092e443/volumes"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.724129 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7dc98d2-302d-4633-8123-fe76bb7dbd40" path="/var/lib/kubelet/pods/a7dc98d2-302d-4633-8123-fe76bb7dbd40/volumes"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.729502 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" podUID="62b902ba-6ba2-48f3-a6dc-652fd1d6d58c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.180:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:28:52 crc kubenswrapper[4836]: I0217 14:28:52.949067 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8722776f-950d-46d6-8929-164cc70747af","Type":"ContainerStarted","Data":"9637dd07ce54453ecac25fa854611f26e2d792f0eece79611b4ba99df3a280fc"}
Feb 17 14:28:53 crc kubenswrapper[4836]: I0217 14:28:53.257316 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-9z4jp"]
Feb 17 14:28:53 crc kubenswrapper[4836]: I0217 14:28:53.766500 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7dc9c9fdbb-zxjj6" podUID="62b902ba-6ba2-48f3-a6dc-652fd1d6d58c" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.180:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:28:53 crc kubenswrapper[4836]: I0217 14:28:53.970793 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8722776f-950d-46d6-8929-164cc70747af","Type":"ContainerStarted","Data":"52da630d0b5f10dbdeca848132a5aeddff0b17cf2d63ba22d9954c518c687970"}
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.006786 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.007144 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.242328 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.242544 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.742204 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.870020 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.870624 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:28:54 crc kubenswrapper[4836]: I0217 14:28:54.871527 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 17 14:28:56 crc kubenswrapper[4836]: I0217 14:28:56.338041 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:57282->10.217.0.179:9311: read: connection reset by peer"
Feb 17 14:28:56 crc kubenswrapper[4836]: I0217 14:28:56.341111 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:57288->10.217.0.179:9311: read: connection reset by peer"
Feb 17 14:28:56 crc kubenswrapper[4836]: I0217 14:28:56.565902 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs"
Feb 17 14:28:56 crc kubenswrapper[4836]: I0217 14:28:56.662929 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-knj6m"]
Feb 17 14:28:56 crc kubenswrapper[4836]: I0217 14:28:56.663239 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerName="dnsmasq-dns" containerID="cri-o://85a21ea6f28662473a5cbe42dfa68ac85c766a6f09753e438f2c37af7356f777" gracePeriod=10
Feb 17 14:28:56 crc kubenswrapper[4836]: I0217 14:28:56.701661 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 14:28:56 crc kubenswrapper[4836]: I0217 14:28:56.775751 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.013316 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-649d9995c8-rcxvp" event={"ID":"ed247b9d-af54-401e-80a3-82d18772f29d","Type":"ContainerDied","Data":"0f637c77116f7f89955d3abdef322406d505cefd736b74cd4ec4ac6a045c16f5"}
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.013252 4836 generic.go:334] "Generic (PLEG): container finished" podID="ed247b9d-af54-401e-80a3-82d18772f29d" containerID="0f637c77116f7f89955d3abdef322406d505cefd736b74cd4ec4ac6a045c16f5" exitCode=0
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.017422 4836 generic.go:334] "Generic (PLEG): container finished" podID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerID="85a21ea6f28662473a5cbe42dfa68ac85c766a6f09753e438f2c37af7356f777" exitCode=0
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.017509 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" event={"ID":"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c","Type":"ContainerDied","Data":"85a21ea6f28662473a5cbe42dfa68ac85c766a6f09753e438f2c37af7356f777"}
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.017798 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="cinder-scheduler" containerID="cri-o://713b2d67a8bcdbac716e75c48ae87c2e03fb9564a7bca1ee0f598db4994298c7" gracePeriod=30
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.017949 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="probe" containerID="cri-o://b853d8aee890fe988c231f41ba4f29bef7974b65a59ee2897863ebef85aca6e3" gracePeriod=30
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.256062 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": dial tcp 10.217.0.179:9311: connect: connection refused"
Feb 17 14:28:57 crc kubenswrapper[4836]: I0217 14:28:57.256059 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-649d9995c8-rcxvp" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": dial tcp 10.217.0.179:9311: connect: connection refused"
Feb 17 14:28:58 crc kubenswrapper[4836]: I0217 14:28:58.051863 4836 generic.go:334] "Generic (PLEG): container finished" podID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerID="b853d8aee890fe988c231f41ba4f29bef7974b65a59ee2897863ebef85aca6e3" exitCode=0
Feb 17 14:28:58 crc kubenswrapper[4836]: I0217 14:28:58.051979 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85d3a41-bec9-4783-a2c6-2e6627156cce","Type":"ContainerDied","Data":"b853d8aee890fe988c231f41ba4f29bef7974b65a59ee2897863ebef85aca6e3"}
Feb 17 14:28:59 crc kubenswrapper[4836]: I0217 14:28:59.075889 4836 generic.go:334] "Generic (PLEG): container finished" podID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerID="713b2d67a8bcdbac716e75c48ae87c2e03fb9564a7bca1ee0f598db4994298c7" exitCode=0
Feb 17 14:28:59 crc kubenswrapper[4836]: I0217 14:28:59.076392 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85d3a41-bec9-4783-a2c6-2e6627156cce","Type":"ContainerDied","Data":"713b2d67a8bcdbac716e75c48ae87c2e03fb9564a7bca1ee0f598db4994298c7"}
Feb 17 14:28:59 crc kubenswrapper[4836]: I0217 14:28:59.312924 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.169:5353: connect: connection refused"
Feb 17 14:28:59 crc kubenswrapper[4836]: W0217 14:28:59.498317 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf38b5f94_bc8b_4e64_abe6_8c39b920cb4b.slice/crio-bcf545502020f0699c5847e3cc9076fff2937319f46acd5c1c65027d98b9be99 WatchSource:0}: Error finding container bcf545502020f0699c5847e3cc9076fff2937319f46acd5c1c65027d98b9be99: Status 404 returned error can't find the container with id bcf545502020f0699c5847e3cc9076fff2937319f46acd5c1c65027d98b9be99
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.089232 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9z4jp" event={"ID":"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b","Type":"ContainerStarted","Data":"bcf545502020f0699c5847e3cc9076fff2937319f46acd5c1c65027d98b9be99"}
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.841108 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-knj6m"
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.854670 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-649d9995c8-rcxvp"
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.931568 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-svc\") pod \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") "
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.931667 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-sb\") pod \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") "
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.931789 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-config\") pod \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") "
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.931921 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7h58\" (UniqueName: \"kubernetes.io/projected/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-kube-api-access-f7h58\") pod \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") "
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.932031 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-nb\") pod \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") "
Feb 17 14:29:00 crc kubenswrapper[4836]: I0217 14:29:00.932180 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-swift-storage-0\") pod \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\" (UID: \"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c\") "
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:00.998663 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-kube-api-access-f7h58" (OuterVolumeSpecName: "kube-api-access-f7h58") pod "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" (UID: "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c"). InnerVolumeSpecName "kube-api-access-f7h58". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.039758 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed247b9d-af54-401e-80a3-82d18772f29d-logs\") pod \"ed247b9d-af54-401e-80a3-82d18772f29d\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") "
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.040102 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data-custom\") pod \"ed247b9d-af54-401e-80a3-82d18772f29d\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") "
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.040144 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-combined-ca-bundle\") pod \"ed247b9d-af54-401e-80a3-82d18772f29d\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") "
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.040188 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slsjs\" (UniqueName: \"kubernetes.io/projected/ed247b9d-af54-401e-80a3-82d18772f29d-kube-api-access-slsjs\") pod \"ed247b9d-af54-401e-80a3-82d18772f29d\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") "
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.040226 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data\") pod \"ed247b9d-af54-401e-80a3-82d18772f29d\" (UID: \"ed247b9d-af54-401e-80a3-82d18772f29d\") "
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.041063 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7h58\" (UniqueName: \"kubernetes.io/projected/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-kube-api-access-f7h58\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.043573 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed247b9d-af54-401e-80a3-82d18772f29d-logs" (OuterVolumeSpecName: "logs") pod "ed247b9d-af54-401e-80a3-82d18772f29d" (UID: "ed247b9d-af54-401e-80a3-82d18772f29d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.095958 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ed247b9d-af54-401e-80a3-82d18772f29d" (UID: "ed247b9d-af54-401e-80a3-82d18772f29d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.137981 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed247b9d-af54-401e-80a3-82d18772f29d-kube-api-access-slsjs" (OuterVolumeSpecName: "kube-api-access-slsjs") pod "ed247b9d-af54-401e-80a3-82d18772f29d" (UID: "ed247b9d-af54-401e-80a3-82d18772f29d"). InnerVolumeSpecName "kube-api-access-slsjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.150248 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slsjs\" (UniqueName: \"kubernetes.io/projected/ed247b9d-af54-401e-80a3-82d18772f29d-kube-api-access-slsjs\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.150312 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed247b9d-af54-401e-80a3-82d18772f29d-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.150327 4836 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.225698 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-knj6m" event={"ID":"7622952e-3f9a-4569-8f4d-8a07f1cbcd2c","Type":"ContainerDied","Data":"8555cc4b8a651ad8d38601eead66d8910d6d4cd8c7c50d4ab726898662d8c02f"}
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.225781 4836 scope.go:117] "RemoveContainer" containerID="85a21ea6f28662473a5cbe42dfa68ac85c766a6f09753e438f2c37af7356f777"
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.225994 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-knj6m"
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.245478 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-649d9995c8-rcxvp" event={"ID":"ed247b9d-af54-401e-80a3-82d18772f29d","Type":"ContainerDied","Data":"569d3d6ec6caf0a5240f5c4b4d890d8ae7a0c22c3878dcb1e8c71559ae9f5a26"}
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.245953 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-649d9995c8-rcxvp"
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.319434 4836 scope.go:117] "RemoveContainer" containerID="50d4a249bcc48e57b448052c5a0747dd07cf392d7bd62132728c04243ac9a69b"
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.442762 4836 scope.go:117] "RemoveContainer" containerID="0f637c77116f7f89955d3abdef322406d505cefd736b74cd4ec4ac6a045c16f5"
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.517206 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed247b9d-af54-401e-80a3-82d18772f29d" (UID: "ed247b9d-af54-401e-80a3-82d18772f29d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.520505 4836 scope.go:117] "RemoveContainer" containerID="1cbfdb3ab6153a9ef06f48a83598052cef7a8d6eccc057f49dbfbf1a10abee8c"
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.576176 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.728320 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" (UID: "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.756035 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" (UID: "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.770639 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" (UID: "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.787826 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.787879 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.787899 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.850615 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data" (OuterVolumeSpecName: "config-data") pod "ed247b9d-af54-401e-80a3-82d18772f29d" (UID: "ed247b9d-af54-401e-80a3-82d18772f29d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.851027 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" (UID: "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.881968 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-config" (OuterVolumeSpecName: "config") pod "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" (UID: "7622952e-3f9a-4569-8f4d-8a07f1cbcd2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.890722 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed247b9d-af54-401e-80a3-82d18772f29d-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.890788 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:01 crc kubenswrapper[4836]: I0217 14:29:01.890805 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c-config\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.000462 4836 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.098512 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-combined-ca-bundle\") pod \"f85d3a41-bec9-4783-a2c6-2e6627156cce\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.098563 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data\") pod \"f85d3a41-bec9-4783-a2c6-2e6627156cce\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.098592 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcwnw\" (UniqueName: \"kubernetes.io/projected/f85d3a41-bec9-4783-a2c6-2e6627156cce-kube-api-access-mcwnw\") pod \"f85d3a41-bec9-4783-a2c6-2e6627156cce\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.098637 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-scripts\") pod \"f85d3a41-bec9-4783-a2c6-2e6627156cce\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.098678 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data-custom\") pod \"f85d3a41-bec9-4783-a2c6-2e6627156cce\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.098743 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/f85d3a41-bec9-4783-a2c6-2e6627156cce-etc-machine-id\") pod \"f85d3a41-bec9-4783-a2c6-2e6627156cce\" (UID: \"f85d3a41-bec9-4783-a2c6-2e6627156cce\") " Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.099241 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85d3a41-bec9-4783-a2c6-2e6627156cce-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f85d3a41-bec9-4783-a2c6-2e6627156cce" (UID: "f85d3a41-bec9-4783-a2c6-2e6627156cce"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.108852 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85d3a41-bec9-4783-a2c6-2e6627156cce-kube-api-access-mcwnw" (OuterVolumeSpecName: "kube-api-access-mcwnw") pod "f85d3a41-bec9-4783-a2c6-2e6627156cce" (UID: "f85d3a41-bec9-4783-a2c6-2e6627156cce"). InnerVolumeSpecName "kube-api-access-mcwnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.116885 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f85d3a41-bec9-4783-a2c6-2e6627156cce" (UID: "f85d3a41-bec9-4783-a2c6-2e6627156cce"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.149650 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-scripts" (OuterVolumeSpecName: "scripts") pod "f85d3a41-bec9-4783-a2c6-2e6627156cce" (UID: "f85d3a41-bec9-4783-a2c6-2e6627156cce"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.201975 4836 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f85d3a41-bec9-4783-a2c6-2e6627156cce-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.202068 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcwnw\" (UniqueName: \"kubernetes.io/projected/f85d3a41-bec9-4783-a2c6-2e6627156cce-kube-api-access-mcwnw\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.202083 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.202098 4836 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.267437 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-649d9995c8-rcxvp"] Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.290385 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-649d9995c8-rcxvp"] Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.316790 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f85d3a41-bec9-4783-a2c6-2e6627156cce","Type":"ContainerDied","Data":"d7078c4a99b5b82f758808cb22901fe00a2b1684715c3195cb32e503ee688469"} Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.316866 4836 scope.go:117] "RemoveContainer" containerID="b853d8aee890fe988c231f41ba4f29bef7974b65a59ee2897863ebef85aca6e3" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 
14:29:02.317029 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.324930 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-knj6m"] Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.345760 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9z4jp" event={"ID":"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b","Type":"ContainerStarted","Data":"852265bc6ffb6ef9657692f454a84caf832b683e76f800e8dccb3317d95a69ea"} Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.363346 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-knj6m"] Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.370813 4836 scope.go:117] "RemoveContainer" containerID="713b2d67a8bcdbac716e75c48ae87c2e03fb9564a7bca1ee0f598db4994298c7" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.378524 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-9z4jp" podStartSLOduration=11.378493514 podStartE2EDuration="11.378493514s" podCreationTimestamp="2026-02-17 14:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:02.368662179 +0000 UTC m=+1368.711590448" watchObservedRunningTime="2026-02-17 14:29:02.378493514 +0000 UTC m=+1368.721421783" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.635745 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f85d3a41-bec9-4783-a2c6-2e6627156cce" (UID: "f85d3a41-bec9-4783-a2c6-2e6627156cce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.647843 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.654763 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" path="/var/lib/kubelet/pods/7622952e-3f9a-4569-8f4d-8a07f1cbcd2c/volumes" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.655620 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" path="/var/lib/kubelet/pods/ed247b9d-af54-401e-80a3-82d18772f29d/volumes" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.733443 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data" (OuterVolumeSpecName: "config-data") pod "f85d3a41-bec9-4783-a2c6-2e6627156cce" (UID: "f85d3a41-bec9-4783-a2c6-2e6627156cce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.750846 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f85d3a41-bec9-4783-a2c6-2e6627156cce-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:02 crc kubenswrapper[4836]: I0217 14:29:02.986901 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.008495 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.061943 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.066941 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.066983 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:29:03 crc kubenswrapper[4836]: E0217 14:29:03.067536 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067561 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" Feb 17 14:29:03 crc kubenswrapper[4836]: E0217 14:29:03.067578 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerName="init" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067585 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerName="init" Feb 17 14:29:03 crc kubenswrapper[4836]: E0217 14:29:03.067602 4836 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="probe" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067609 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="probe" Feb 17 14:29:03 crc kubenswrapper[4836]: E0217 14:29:03.067624 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="cinder-scheduler" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067631 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="cinder-scheduler" Feb 17 14:29:03 crc kubenswrapper[4836]: E0217 14:29:03.067651 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerName="dnsmasq-dns" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067659 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerName="dnsmasq-dns" Feb 17 14:29:03 crc kubenswrapper[4836]: E0217 14:29:03.067668 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067674 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067912 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="7622952e-3f9a-4569-8f4d-8a07f1cbcd2c" containerName="dnsmasq-dns" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067932 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="probe" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067956 4836 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api-log" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067970 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed247b9d-af54-401e-80a3-82d18772f29d" containerName="barbican-api" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.067980 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" containerName="cinder-scheduler" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.070239 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.072853 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.109868 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:29:03 crc kubenswrapper[4836]: E0217 14:29:03.197045 4836 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf85d3a41_bec9_4783_a2c6_2e6627156cce.slice\": RecentStats: unable to find data in memory cache]" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.274561 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-config-data\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.274666 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.274773 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.274859 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6a7955-6cfb-4afe-b94a-8900513e5821-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.274924 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.274951 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7vf\" (UniqueName: \"kubernetes.io/projected/0e6a7955-6cfb-4afe-b94a-8900513e5821-kube-api-access-fp7vf\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.376927 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8722776f-950d-46d6-8929-164cc70747af","Type":"ContainerStarted","Data":"c09d4eb94f75fb36fdad30edc1250daa438667981d17bf085edb909265f5f881"} Feb 17 14:29:03 crc kubenswrapper[4836]: 
I0217 14:29:03.377638 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.378435 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.387558 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.388238 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6a7955-6cfb-4afe-b94a-8900513e5821-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.388382 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e6a7955-6cfb-4afe-b94a-8900513e5821-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.388869 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.392412 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7vf\" (UniqueName: 
\"kubernetes.io/projected/0e6a7955-6cfb-4afe-b94a-8900513e5821-kube-api-access-fp7vf\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.395728 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-config-data\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.396703 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-central-agent" containerID="cri-o://6102d176b1010bbf234d415140cba35d28570c5b514c7edd1c4a0962a14c5149" gracePeriod=30 Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.396840 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="proxy-httpd" containerID="cri-o://00d436156aa07858f79630bc19852984525e2688b6a3d2302eeae168425ab6a8" gracePeriod=30 Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.396898 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="sg-core" containerID="cri-o://b11cf843196ed96ab329470f8fb90c845e937e84667798d3853568520da77e41" gracePeriod=30 Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.396950 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-notification-agent" containerID="cri-o://adc3ef3643d684dbbbf0790a30dd752752d5a28971c3915143c0a6ec314bc365" gracePeriod=30 Feb 17 14:29:03 crc kubenswrapper[4836]: 
I0217 14:29:03.395875 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerStarted","Data":"00d436156aa07858f79630bc19852984525e2688b6a3d2302eeae168425ab6a8"} Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.403376 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.405466 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-config-data\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.408763 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-scripts\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.427901 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.433083 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7vf\" (UniqueName: \"kubernetes.io/projected/0e6a7955-6cfb-4afe-b94a-8900513e5821-kube-api-access-fp7vf\") pod \"cinder-scheduler-0\" (UID: 
\"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.433114 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e6a7955-6cfb-4afe-b94a-8900513e5821-scripts\") pod \"cinder-scheduler-0\" (UID: \"0e6a7955-6cfb-4afe-b94a-8900513e5821\") " pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.449651 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=13.44962423 podStartE2EDuration="13.44962423s" podCreationTimestamp="2026-02-17 14:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:03.422372095 +0000 UTC m=+1369.765300384" watchObservedRunningTime="2026-02-17 14:29:03.44962423 +0000 UTC m=+1369.792552499" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.534100 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.678916803 podStartE2EDuration="1m26.534070411s" podCreationTimestamp="2026-02-17 14:27:37 +0000 UTC" firstStartedPulling="2026-02-17 14:27:41.033178975 +0000 UTC m=+1287.376107244" lastFinishedPulling="2026-02-17 14:29:00.888332583 +0000 UTC m=+1367.231260852" observedRunningTime="2026-02-17 14:29:03.503332231 +0000 UTC m=+1369.846260520" watchObservedRunningTime="2026-02-17 14:29:03.534070411 +0000 UTC m=+1369.876998690" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.710874 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 14:29:03 crc kubenswrapper[4836]: I0217 14:29:03.920432 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-bc958ddf6-kh2rq" Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.019910 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55d7557768-wvvpt"] Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.447522 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.456773 4836 generic.go:334] "Generic (PLEG): container finished" podID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerID="00d436156aa07858f79630bc19852984525e2688b6a3d2302eeae168425ab6a8" exitCode=0 Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.456822 4836 generic.go:334] "Generic (PLEG): container finished" podID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerID="b11cf843196ed96ab329470f8fb90c845e937e84667798d3853568520da77e41" exitCode=2 Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.458095 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerDied","Data":"00d436156aa07858f79630bc19852984525e2688b6a3d2302eeae168425ab6a8"} Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.458135 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerDied","Data":"b11cf843196ed96ab329470f8fb90c845e937e84667798d3853568520da77e41"} Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.458664 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55d7557768-wvvpt" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-log" 
containerID="cri-o://d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9" gracePeriod=30 Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.467908 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-55d7557768-wvvpt" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-api" containerID="cri-o://f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387" gracePeriod=30 Feb 17 14:29:04 crc kubenswrapper[4836]: I0217 14:29:04.647615 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85d3a41-bec9-4783-a2c6-2e6627156cce" path="/var/lib/kubelet/pods/f85d3a41-bec9-4783-a2c6-2e6627156cce/volumes" Feb 17 14:29:05 crc kubenswrapper[4836]: I0217 14:29:05.261621 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-78c4d587b5-cqhdl" Feb 17 14:29:05 crc kubenswrapper[4836]: I0217 14:29:05.479208 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0e6a7955-6cfb-4afe-b94a-8900513e5821","Type":"ContainerStarted","Data":"bad994fb76f9d443edc0fdf20c6b6fc382886e5e32679fe615b7082beecb7dc9"} Feb 17 14:29:05 crc kubenswrapper[4836]: I0217 14:29:05.481529 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0e6a7955-6cfb-4afe-b94a-8900513e5821","Type":"ContainerStarted","Data":"0ff66183221e23285e33ce8ba7036e671c6744494c4f6afbcbe3b944a278bb33"} Feb 17 14:29:05 crc kubenswrapper[4836]: I0217 14:29:05.507535 4836 generic.go:334] "Generic (PLEG): container finished" podID="21c73844-3235-4a12-9f77-901ba8614e11" containerID="d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9" exitCode=143 Feb 17 14:29:05 crc kubenswrapper[4836]: I0217 14:29:05.507666 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d7557768-wvvpt" 
event={"ID":"21c73844-3235-4a12-9f77-901ba8614e11","Type":"ContainerDied","Data":"d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9"} Feb 17 14:29:05 crc kubenswrapper[4836]: I0217 14:29:05.511454 4836 generic.go:334] "Generic (PLEG): container finished" podID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerID="6102d176b1010bbf234d415140cba35d28570c5b514c7edd1c4a0962a14c5149" exitCode=0 Feb 17 14:29:05 crc kubenswrapper[4836]: I0217 14:29:05.511488 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerDied","Data":"6102d176b1010bbf234d415140cba35d28570c5b514c7edd1c4a0962a14c5149"} Feb 17 14:29:06 crc kubenswrapper[4836]: I0217 14:29:06.528198 4836 generic.go:334] "Generic (PLEG): container finished" podID="f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" containerID="852265bc6ffb6ef9657692f454a84caf832b683e76f800e8dccb3317d95a69ea" exitCode=0 Feb 17 14:29:06 crc kubenswrapper[4836]: I0217 14:29:06.528429 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-9z4jp" event={"ID":"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b","Type":"ContainerDied","Data":"852265bc6ffb6ef9657692f454a84caf832b683e76f800e8dccb3317d95a69ea"} Feb 17 14:29:06 crc kubenswrapper[4836]: I0217 14:29:06.532438 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0e6a7955-6cfb-4afe-b94a-8900513e5821","Type":"ContainerStarted","Data":"ce53db1025dd2df25f65899e35ad1a8def5e8eb83bb5ab312beb7b67fda33f93"} Feb 17 14:29:06 crc kubenswrapper[4836]: I0217 14:29:06.582063 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.582035321 podStartE2EDuration="4.582035321s" podCreationTimestamp="2026-02-17 14:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 14:29:06.573794308 +0000 UTC m=+1372.916722577" watchObservedRunningTime="2026-02-17 14:29:06.582035321 +0000 UTC m=+1372.924963590" Feb 17 14:29:07 crc kubenswrapper[4836]: I0217 14:29:07.582010 4836 generic.go:334] "Generic (PLEG): container finished" podID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerID="adc3ef3643d684dbbbf0790a30dd752752d5a28971c3915143c0a6ec314bc365" exitCode=0 Feb 17 14:29:07 crc kubenswrapper[4836]: I0217 14:29:07.582082 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerDied","Data":"adc3ef3643d684dbbbf0790a30dd752752d5a28971c3915143c0a6ec314bc365"} Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.241571 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.326917 4836 util.go:48] "No ready sandbox for pod can be found. 
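The startup-latency entry above reports podStartSLOduration=4.582035321s for cinder-scheduler-0; the values shown are consistent with watchObservedRunningTime minus podCreationTimestamp. A minimal sketch of that arithmetic, using the timestamps taken directly from the log (the subtraction itself is an inference from the values, not a documented kubelet formula):

```python
from datetime import datetime, timezone

# Timestamps copied from the log entry above.
created = datetime(2026, 2, 17, 14, 29, 2, tzinfo=timezone.utc)          # podCreationTimestamp
observed = datetime(2026, 2, 17, 14, 29, 6, 582035, tzinfo=timezone.utc) # watchObservedRunningTime (truncated to µs)

# Elapsed wall-clock time between creation and observed running state.
duration = (observed - created).total_seconds()
print(f"{duration:.6f}s")  # matches podStartSLOduration to µs precision
```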
Need to start a new one" pod="openstack/cloudkitty-storageinit-9z4jp" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.347732 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-combined-ca-bundle\") pod \"2a1d16f5-4710-43b4-805e-315ed73bb24e\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.348715 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-config-data\") pod \"2a1d16f5-4710-43b4-805e-315ed73bb24e\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.348915 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-config-data\") pod \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.348961 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-run-httpd\") pod \"2a1d16f5-4710-43b4-805e-315ed73bb24e\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.349016 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-scripts\") pod \"2a1d16f5-4710-43b4-805e-315ed73bb24e\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.349072 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-sg-core-conf-yaml\") pod \"2a1d16f5-4710-43b4-805e-315ed73bb24e\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.349119 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpknx\" (UniqueName: \"kubernetes.io/projected/2a1d16f5-4710-43b4-805e-315ed73bb24e-kube-api-access-dpknx\") pod \"2a1d16f5-4710-43b4-805e-315ed73bb24e\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.349146 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-scripts\") pod \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.349170 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-certs\") pod \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.349195 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-log-httpd\") pod \"2a1d16f5-4710-43b4-805e-315ed73bb24e\" (UID: \"2a1d16f5-4710-43b4-805e-315ed73bb24e\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.350244 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2a1d16f5-4710-43b4-805e-315ed73bb24e" (UID: "2a1d16f5-4710-43b4-805e-315ed73bb24e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.350863 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.355454 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2a1d16f5-4710-43b4-805e-315ed73bb24e" (UID: "2a1d16f5-4710-43b4-805e-315ed73bb24e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.357807 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-certs" (OuterVolumeSpecName: "certs") pod "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" (UID: "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.360803 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1d16f5-4710-43b4-805e-315ed73bb24e-kube-api-access-dpknx" (OuterVolumeSpecName: "kube-api-access-dpknx") pod "2a1d16f5-4710-43b4-805e-315ed73bb24e" (UID: "2a1d16f5-4710-43b4-805e-315ed73bb24e"). InnerVolumeSpecName "kube-api-access-dpknx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.369208 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-scripts" (OuterVolumeSpecName: "scripts") pod "2a1d16f5-4710-43b4-805e-315ed73bb24e" (UID: "2a1d16f5-4710-43b4-805e-315ed73bb24e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.380437 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-scripts" (OuterVolumeSpecName: "scripts") pod "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" (UID: "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.406705 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2a1d16f5-4710-43b4-805e-315ed73bb24e" (UID: "2a1d16f5-4710-43b4-805e-315ed73bb24e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.412838 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-config-data" (OuterVolumeSpecName: "config-data") pod "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" (UID: "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.417984 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.464173 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a1d16f5-4710-43b4-805e-315ed73bb24e" (UID: "2a1d16f5-4710-43b4-805e-315ed73bb24e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.472948 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfjpm\" (UniqueName: \"kubernetes.io/projected/21c73844-3235-4a12-9f77-901ba8614e11-kube-api-access-nfjpm\") pod \"21c73844-3235-4a12-9f77-901ba8614e11\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473067 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-combined-ca-bundle\") pod \"21c73844-3235-4a12-9f77-901ba8614e11\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473142 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-scripts\") pod \"21c73844-3235-4a12-9f77-901ba8614e11\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473323 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c73844-3235-4a12-9f77-901ba8614e11-logs\") pod \"21c73844-3235-4a12-9f77-901ba8614e11\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473362 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-internal-tls-certs\") pod \"21c73844-3235-4a12-9f77-901ba8614e11\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473455 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-config-data\") pod \"21c73844-3235-4a12-9f77-901ba8614e11\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473485 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-combined-ca-bundle\") pod \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473506 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-public-tls-certs\") pod \"21c73844-3235-4a12-9f77-901ba8614e11\" (UID: \"21c73844-3235-4a12-9f77-901ba8614e11\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.473550 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82gk8\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-kube-api-access-82gk8\") pod \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\" (UID: \"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b\") " Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.478512 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c73844-3235-4a12-9f77-901ba8614e11-logs" (OuterVolumeSpecName: "logs") pod "21c73844-3235-4a12-9f77-901ba8614e11" (UID: "21c73844-3235-4a12-9f77-901ba8614e11"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.481729 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-kube-api-access-82gk8" (OuterVolumeSpecName: "kube-api-access-82gk8") pod "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" (UID: "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b"). InnerVolumeSpecName "kube-api-access-82gk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.482556 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c73844-3235-4a12-9f77-901ba8614e11-kube-api-access-nfjpm" (OuterVolumeSpecName: "kube-api-access-nfjpm") pod "21c73844-3235-4a12-9f77-901ba8614e11" (UID: "21c73844-3235-4a12-9f77-901ba8614e11"). InnerVolumeSpecName "kube-api-access-nfjpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488514 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488548 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488565 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c73844-3235-4a12-9f77-901ba8614e11-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488576 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpknx\" (UniqueName: 
\"kubernetes.io/projected/2a1d16f5-4710-43b4-805e-315ed73bb24e-kube-api-access-dpknx\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488587 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488598 4836 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488606 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a1d16f5-4710-43b4-805e-315ed73bb24e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488617 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488626 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82gk8\" (UniqueName: \"kubernetes.io/projected/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-kube-api-access-82gk8\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488637 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfjpm\" (UniqueName: \"kubernetes.io/projected/21c73844-3235-4a12-9f77-901ba8614e11-kube-api-access-nfjpm\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.488648 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc 
kubenswrapper[4836]: I0217 14:29:08.519013 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-config-data" (OuterVolumeSpecName: "config-data") pod "2a1d16f5-4710-43b4-805e-315ed73bb24e" (UID: "2a1d16f5-4710-43b4-805e-315ed73bb24e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.534406 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 14:29:08 crc kubenswrapper[4836]: E0217 14:29:08.534991 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="sg-core" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535022 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="sg-core" Feb 17 14:29:08 crc kubenswrapper[4836]: E0217 14:29:08.535051 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-notification-agent" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535059 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-notification-agent" Feb 17 14:29:08 crc kubenswrapper[4836]: E0217 14:29:08.535081 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-central-agent" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535089 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-central-agent" Feb 17 14:29:08 crc kubenswrapper[4836]: E0217 14:29:08.535099 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-log" Feb 17 14:29:08 crc 
kubenswrapper[4836]: I0217 14:29:08.535107 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-log" Feb 17 14:29:08 crc kubenswrapper[4836]: E0217 14:29:08.535115 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="proxy-httpd" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535122 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="proxy-httpd" Feb 17 14:29:08 crc kubenswrapper[4836]: E0217 14:29:08.535137 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" containerName="cloudkitty-storageinit" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535145 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" containerName="cloudkitty-storageinit" Feb 17 14:29:08 crc kubenswrapper[4836]: E0217 14:29:08.535155 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-api" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535162 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-api" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535408 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-central-agent" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535430 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="ceilometer-notification-agent" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535441 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="proxy-httpd" Feb 17 
14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535460 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-api" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535483 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" containerName="sg-core" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535493 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c73844-3235-4a12-9f77-901ba8614e11" containerName="placement-log" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.535510 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" containerName="cloudkitty-storageinit" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.536563 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.542929 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-scripts" (OuterVolumeSpecName: "scripts") pod "21c73844-3235-4a12-9f77-901ba8614e11" (UID: "21c73844-3235-4a12-9f77-901ba8614e11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.548182 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fcq7g" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.548592 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.548209 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.553258 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.590480 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drz9c\" (UniqueName: \"kubernetes.io/projected/a8afff37-cd9b-46c0-b407-7c2fb5bada37-kube-api-access-drz9c\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.590587 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.590612 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config-secret\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.590662 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.590872 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.590886 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a1d16f5-4710-43b4-805e-315ed73bb24e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.629584 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" (UID: "f38b5f94-bc8b-4e64-abe6-8c39b920cb4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.635251 4836 generic.go:334] "Generic (PLEG): container finished" podID="21c73844-3235-4a12-9f77-901ba8614e11" containerID="f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387" exitCode=0 Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.635570 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-55d7557768-wvvpt" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.649470 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.654424 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21c73844-3235-4a12-9f77-901ba8614e11" (UID: "21c73844-3235-4a12-9f77-901ba8614e11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.656461 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-9z4jp" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.683528 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-config-data" (OuterVolumeSpecName: "config-data") pod "21c73844-3235-4a12-9f77-901ba8614e11" (UID: "21c73844-3235-4a12-9f77-901ba8614e11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.692592 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.692644 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config-secret\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.692735 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.692850 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drz9c\" (UniqueName: \"kubernetes.io/projected/a8afff37-cd9b-46c0-b407-7c2fb5bada37-kube-api-access-drz9c\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.692967 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.692979 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.692988 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.694086 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.707606 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.727688 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config-secret\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.746054 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drz9c\" (UniqueName: \"kubernetes.io/projected/a8afff37-cd9b-46c0-b407-7c2fb5bada37-kube-api-access-drz9c\") pod \"openstackclient\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.809640 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "21c73844-3235-4a12-9f77-901ba8614e11" (UID: "21c73844-3235-4a12-9f77-901ba8614e11"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.871662 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "21c73844-3235-4a12-9f77-901ba8614e11" (UID: "21c73844-3235-4a12-9f77-901ba8614e11"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.890945 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.890998 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d7557768-wvvpt" event={"ID":"21c73844-3235-4a12-9f77-901ba8614e11","Type":"ContainerDied","Data":"f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387"} Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.891043 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-55d7557768-wvvpt" event={"ID":"21c73844-3235-4a12-9f77-901ba8614e11","Type":"ContainerDied","Data":"bbe789caf6ed33cc607fbf4e010b5eb03468b6cfaedd4af371c447ef9c0fa67b"} Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.891066 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a1d16f5-4710-43b4-805e-315ed73bb24e","Type":"ContainerDied","Data":"725a655ac601adcaa8185b937f6643704390b16c79c731f2de3ba649c346ef2b"} Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.891084 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-storageinit-9z4jp" event={"ID":"f38b5f94-bc8b-4e64-abe6-8c39b920cb4b","Type":"ContainerDied","Data":"bcf545502020f0699c5847e3cc9076fff2937319f46acd5c1c65027d98b9be99"} Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.891104 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcf545502020f0699c5847e3cc9076fff2937319f46acd5c1c65027d98b9be99" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.891128 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.893145 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.893197 4836 scope.go:117] "RemoveContainer" containerID="f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.897123 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.897538 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.902035 4836 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.902078 4836 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c73844-3235-4a12-9f77-901ba8614e11-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.903445 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 17 14:29:08 crc kubenswrapper[4836]: 
I0217 14:29:08.903672 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.906748 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-l28cf" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.930885 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.940232 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.964956 4836 scope.go:117] "RemoveContainer" containerID="d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9" Feb 17 14:29:08 crc kubenswrapper[4836]: I0217 14:29:08.993114 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-cjz8m"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.004962 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-certs\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.005031 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.005065 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-combined-ca-bundle\") 
pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.005083 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.005105 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gqz9\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-kube-api-access-8gqz9\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.005363 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.012909 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.040061 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-cjz8m"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.125224 4836 scope.go:117] "RemoveContainer" containerID="f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387" Feb 17 14:29:09 crc kubenswrapper[4836]: E0217 14:29:09.128860 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387\": container with ID starting with f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387 not found: ID does not exist" containerID="f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.128932 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387"} err="failed to get container status \"f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387\": rpc error: code = NotFound desc = could not find container \"f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387\": container with ID starting with f44d059f8118ff4bd369e70f4ee21eeb5fe30846206d89ff637f2b1a0a5be387 not found: ID does not exist" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.128977 4836 scope.go:117] "RemoveContainer" containerID="d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9" Feb 17 14:29:09 crc kubenswrapper[4836]: E0217 14:29:09.130880 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9\": container with ID starting with 
d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9 not found: ID does not exist" containerID="d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.130924 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9"} err="failed to get container status \"d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9\": rpc error: code = NotFound desc = could not find container \"d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9\": container with ID starting with d7a1e6a91c2fd3547de1abbc6562e149219faa94982d5287b5144a4b6b8eb8b9 not found: ID does not exist" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.130953 4836 scope.go:117] "RemoveContainer" containerID="00d436156aa07858f79630bc19852984525e2688b6a3d2302eeae168425ab6a8" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.133322 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tzl2\" (UniqueName: \"kubernetes.io/projected/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-kube-api-access-9tzl2\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.133442 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.133503 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-certs\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.133531 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.133607 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.133631 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.133986 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.134027 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data\") pod 
\"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.134045 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gqz9\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-kube-api-access-8gqz9\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.134072 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-config\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.134409 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-svc\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.134444 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.150324 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-scripts\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 
14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.153819 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.161516 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.161672 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.173769 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-certs\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.190029 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gqz9\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-kube-api-access-8gqz9\") pod \"cloudkitty-proc-0\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.204044 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.229881 
4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.235931 4836 scope.go:117] "RemoveContainer" containerID="b11cf843196ed96ab329470f8fb90c845e937e84667798d3853568520da77e41" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.238691 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.238766 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-config\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.239016 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-svc\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.239088 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tzl2\" (UniqueName: \"kubernetes.io/projected/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-kube-api-access-9tzl2\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.239210 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.239265 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.239692 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-swift-storage-0\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.240153 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-nb\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.240276 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-svc\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.241318 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-config\") pod 
\"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.242006 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-sb\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.251760 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.258927 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.261340 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.263358 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tzl2\" (UniqueName: \"kubernetes.io/projected/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-kube-api-access-9tzl2\") pod \"dnsmasq-dns-67bdc55879-cjz8m\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.273212 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.277154 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.284760 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.285621 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:29:09 crc kubenswrapper[4836]: E0217 14:29:09.294744 4836 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 17 14:29:09 crc kubenswrapper[4836]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_a8afff37-cd9b-46c0-b407-7c2fb5bada37_0(2ca36da22407de10e49ceddacef100ff3f094b59a7a992ff8a6053e904e4c186): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2ca36da22407de10e49ceddacef100ff3f094b59a7a992ff8a6053e904e4c186" Netns:"/var/run/netns/6f100115-555d-4d51-932f-d431d3cb1f50" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2ca36da22407de10e49ceddacef100ff3f094b59a7a992ff8a6053e904e4c186;K8S_POD_UID=a8afff37-cd9b-46c0-b407-7c2fb5bada37" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/a8afff37-cd9b-46c0-b407-7c2fb5bada37]: expected pod UID "a8afff37-cd9b-46c0-b407-7c2fb5bada37" but got "4fe674a8-c32b-412e-8d20-2a6e7e18bb10" from Kube API Feb 17 14:29:09 crc kubenswrapper[4836]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 14:29:09 crc kubenswrapper[4836]: > Feb 17 14:29:09 crc kubenswrapper[4836]: E0217 14:29:09.295425 4836 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 17 14:29:09 crc kubenswrapper[4836]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_a8afff37-cd9b-46c0-b407-7c2fb5bada37_0(2ca36da22407de10e49ceddacef100ff3f094b59a7a992ff8a6053e904e4c186): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"2ca36da22407de10e49ceddacef100ff3f094b59a7a992ff8a6053e904e4c186" Netns:"/var/run/netns/6f100115-555d-4d51-932f-d431d3cb1f50" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=2ca36da22407de10e49ceddacef100ff3f094b59a7a992ff8a6053e904e4c186;K8S_POD_UID=a8afff37-cd9b-46c0-b407-7c2fb5bada37" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/a8afff37-cd9b-46c0-b407-7c2fb5bada37]: expected pod UID "a8afff37-cd9b-46c0-b407-7c2fb5bada37" but got "4fe674a8-c32b-412e-8d20-2a6e7e18bb10" from Kube API Feb 17 14:29:09 crc kubenswrapper[4836]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 14:29:09 crc kubenswrapper[4836]: > pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.297691 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-55d7557768-wvvpt"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.320427 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.326971 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.328567 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.341823 4836 scope.go:117] "RemoveContainer" containerID="adc3ef3643d684dbbbf0790a30dd752752d5a28971c3915143c0a6ec314bc365" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.356879 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-55d7557768-wvvpt"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.357260 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8pml\" (UniqueName: \"kubernetes.io/projected/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-kube-api-access-t8pml\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.357439 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358019 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-config-data\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358136 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-openstack-config\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358166 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358340 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pcw\" (UniqueName: \"kubernetes.io/projected/d9f887f5-6ce0-4320-94fc-024b1b9ef725-kube-api-access-b7pcw\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358447 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358560 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-openstack-config-secret\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358589 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-scripts\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358675 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.358809 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.366453 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.385330 4836 scope.go:117] "RemoveContainer" containerID="6102d176b1010bbf234d415140cba35d28570c5b514c7edd1c4a0962a14c5149" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.426434 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.430154 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.443919 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.444040 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.447019 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462124 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462321 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8pml\" (UniqueName: \"kubernetes.io/projected/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-kube-api-access-t8pml\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462394 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462448 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-config-data\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462491 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-openstack-config\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462514 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462556 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7pcw\" (UniqueName: \"kubernetes.io/projected/d9f887f5-6ce0-4320-94fc-024b1b9ef725-kube-api-access-b7pcw\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462603 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462647 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-openstack-config-secret\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462680 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-scripts\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 
14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462712 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.462951 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-run-httpd\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.463323 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-log-httpd\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.465047 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-openstack-config\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.468701 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.469110 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-combined-ca-bundle\") pod 
\"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.469977 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.470416 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-scripts\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.470457 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-config-data\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.471202 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-openstack-config-secret\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.486867 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pcw\" (UniqueName: \"kubernetes.io/projected/d9f887f5-6ce0-4320-94fc-024b1b9ef725-kube-api-access-b7pcw\") pod \"ceilometer-0\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.497715 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t8pml\" (UniqueName: \"kubernetes.io/projected/4fe674a8-c32b-412e-8d20-2a6e7e18bb10-kube-api-access-t8pml\") pod \"openstackclient\" (UID: \"4fe674a8-c32b-412e-8d20-2a6e7e18bb10\") " pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.566277 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.566965 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.567086 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-certs\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.567148 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-scripts\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.567235 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.567272 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb10908e-7be1-4ca0-8743-7f9aaae820b7-logs\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.567327 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ftfn\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-kube-api-access-7ftfn\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.615303 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.656807 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.669840 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb10908e-7be1-4ca0-8743-7f9aaae820b7-logs\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.669923 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ftfn\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-kube-api-access-7ftfn\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.670035 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.670067 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.670174 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-certs\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.670252 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-scripts\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.670354 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.679093 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.683562 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.686348 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-scripts\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.688942 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-certs\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc 
kubenswrapper[4836]: I0217 14:29:09.690051 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb10908e-7be1-4ca0-8743-7f9aaae820b7-logs\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.721664 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.766511 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ftfn\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-kube-api-access-7ftfn\") pod \"cloudkitty-api-0\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") " pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.777587 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.792500 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.802820 4836 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a8afff37-cd9b-46c0-b407-7c2fb5bada37" podUID="4fe674a8-c32b-412e-8d20-2a6e7e18bb10" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.817709 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.884848 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config-secret\") pod \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.890004 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-combined-ca-bundle\") pod \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.890627 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config\") pod \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.890935 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drz9c\" (UniqueName: \"kubernetes.io/projected/a8afff37-cd9b-46c0-b407-7c2fb5bada37-kube-api-access-drz9c\") pod \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\" (UID: \"a8afff37-cd9b-46c0-b407-7c2fb5bada37\") " Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.891067 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a8afff37-cd9b-46c0-b407-7c2fb5bada37" (UID: "a8afff37-cd9b-46c0-b407-7c2fb5bada37"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.892683 4836 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.893896 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a8afff37-cd9b-46c0-b407-7c2fb5bada37" (UID: "a8afff37-cd9b-46c0-b407-7c2fb5bada37"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.895454 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8afff37-cd9b-46c0-b407-7c2fb5bada37" (UID: "a8afff37-cd9b-46c0-b407-7c2fb5bada37"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.910181 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8afff37-cd9b-46c0-b407-7c2fb5bada37-kube-api-access-drz9c" (OuterVolumeSpecName: "kube-api-access-drz9c") pod "a8afff37-cd9b-46c0-b407-7c2fb5bada37" (UID: "a8afff37-cd9b-46c0-b407-7c2fb5bada37"). InnerVolumeSpecName "kube-api-access-drz9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.995116 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drz9c\" (UniqueName: \"kubernetes.io/projected/a8afff37-cd9b-46c0-b407-7c2fb5bada37-kube-api-access-drz9c\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.995169 4836 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:09 crc kubenswrapper[4836]: I0217 14:29:09.995193 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8afff37-cd9b-46c0-b407-7c2fb5bada37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.048005 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.308276 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-cjz8m"] Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.610426 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c73844-3235-4a12-9f77-901ba8614e11" path="/var/lib/kubelet/pods/21c73844-3235-4a12-9f77-901ba8614e11/volumes" Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.611879 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1d16f5-4710-43b4-805e-315ed73bb24e" path="/var/lib/kubelet/pods/2a1d16f5-4710-43b4-805e-315ed73bb24e/volumes" Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.614208 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8afff37-cd9b-46c0-b407-7c2fb5bada37" path="/var/lib/kubelet/pods/a8afff37-cd9b-46c0-b407-7c2fb5bada37/volumes" Feb 17 14:29:10 crc 
kubenswrapper[4836]: I0217 14:29:10.648761 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 14:29:10 crc kubenswrapper[4836]: W0217 14:29:10.680213 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe674a8_c32b_412e_8d20_2a6e7e18bb10.slice/crio-8d85d78a74909b5d4e4e057b867c51974828a505741fdff580c1df982a520ea3 WatchSource:0}: Error finding container 8d85d78a74909b5d4e4e057b867c51974828a505741fdff580c1df982a520ea3: Status 404 returned error can't find the container with id 8d85d78a74909b5d4e4e057b867c51974828a505741fdff580c1df982a520ea3 Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.754142 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.816829 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4fe674a8-c32b-412e-8d20-2a6e7e18bb10","Type":"ContainerStarted","Data":"8d85d78a74909b5d4e4e057b867c51974828a505741fdff580c1df982a520ea3"} Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.820252 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerStarted","Data":"2b8910649b123c250a9b2ae2a0273df5052a76cf9ac3a4d666b31acdde9dcd6e"} Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.833662 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"39d3cbf1-d107-4004-9eec-698f8f4360b9","Type":"ContainerStarted","Data":"9d18961b4f807b2d078a92a071d329120e24d89e463eadbb04ec662d87231dc8"} Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.842197 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.842742 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" event={"ID":"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d","Type":"ContainerStarted","Data":"3d5f1259a1d6811a1bf928961a17e403bc60cdb65dc5d67063f562d7b7e44223"} Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.848861 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 17 14:29:10 crc kubenswrapper[4836]: I0217 14:29:10.899569 4836 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a8afff37-cd9b-46c0-b407-7c2fb5bada37" podUID="4fe674a8-c32b-412e-8d20-2a6e7e18bb10" Feb 17 14:29:11 crc kubenswrapper[4836]: I0217 14:29:11.874027 4836 generic.go:334] "Generic (PLEG): container finished" podID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerID="1fc9116efed5aa1cde1e1851a8feece763300523cbdc4d6253a5c08f4f4f9f36" exitCode=0 Feb 17 14:29:11 crc kubenswrapper[4836]: I0217 14:29:11.874206 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" event={"ID":"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d","Type":"ContainerDied","Data":"1fc9116efed5aa1cde1e1851a8feece763300523cbdc4d6253a5c08f4f4f9f36"} Feb 17 14:29:11 crc kubenswrapper[4836]: I0217 14:29:11.900160 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"bb10908e-7be1-4ca0-8743-7f9aaae820b7","Type":"ContainerStarted","Data":"82bc2e3a70ecec92d3994393a44d6752f39e87009a67fa6ce836cf5fea4e8d25"} Feb 17 14:29:11 crc kubenswrapper[4836]: I0217 14:29:11.900226 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"bb10908e-7be1-4ca0-8743-7f9aaae820b7","Type":"ContainerStarted","Data":"fef50b67c5370254f40d86b0f2cec5c6baf88547e44514223d3def4388ffb9b9"} Feb 17 14:29:11 crc 
kubenswrapper[4836]: I0217 14:29:11.900239 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"bb10908e-7be1-4ca0-8743-7f9aaae820b7","Type":"ContainerStarted","Data":"d9850d73e5e57596ca879cc8b2c2875e4f65ab2930249ef121a2ca2c42da234e"} Feb 17 14:29:11 crc kubenswrapper[4836]: I0217 14:29:11.900323 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 17 14:29:11 crc kubenswrapper[4836]: I0217 14:29:11.985130 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.98509147 podStartE2EDuration="2.98509147s" podCreationTimestamp="2026-02-17 14:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:11.937148184 +0000 UTC m=+1378.280076463" watchObservedRunningTime="2026-02-17 14:29:11.98509147 +0000 UTC m=+1378.328019749" Feb 17 14:29:12 crc kubenswrapper[4836]: I0217 14:29:12.723435 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 17 14:29:13 crc kubenswrapper[4836]: I0217 14:29:13.939085 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"39d3cbf1-d107-4004-9eec-698f8f4360b9","Type":"ContainerStarted","Data":"9ee60ada822c522c9249d0e3c31f511d939804abdb610bce124e951b7000a09d"} Feb 17 14:29:13 crc kubenswrapper[4836]: I0217 14:29:13.948010 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" event={"ID":"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d","Type":"ContainerStarted","Data":"407f5678203e5e174c01300835b55b61252a1ab248014426970911ab531d756b"} Feb 17 14:29:13 crc kubenswrapper[4836]: I0217 14:29:13.948165 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:29:13 crc kubenswrapper[4836]: I0217 
14:29:13.950242 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api-log" containerID="cri-o://fef50b67c5370254f40d86b0f2cec5c6baf88547e44514223d3def4388ffb9b9" gracePeriod=30
Feb 17 14:29:13 crc kubenswrapper[4836]: I0217 14:29:13.950542 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerStarted","Data":"fb5cf9ee8d101cc6fae1fb5f79f35c27c28cf9fc0fa0631bc345f006efba64c9"}
Feb 17 14:29:13 crc kubenswrapper[4836]: I0217 14:29:13.950610 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api" containerID="cri-o://82bc2e3a70ecec92d3994393a44d6752f39e87009a67fa6ce836cf5fea4e8d25" gracePeriod=30
Feb 17 14:29:13 crc kubenswrapper[4836]: I0217 14:29:13.971654 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.838848005 podStartE2EDuration="5.971626155s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="2026-02-17 14:29:10.299594533 +0000 UTC m=+1376.642522802" lastFinishedPulling="2026-02-17 14:29:13.432372683 +0000 UTC m=+1379.775300952" observedRunningTime="2026-02-17 14:29:13.960916066 +0000 UTC m=+1380.303844335" watchObservedRunningTime="2026-02-17 14:29:13.971626155 +0000 UTC m=+1380.314554424"
Feb 17 14:29:14 crc kubenswrapper[4836]: I0217 14:29:14.016989 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 17 14:29:14 crc kubenswrapper[4836]: I0217 14:29:14.029770 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" podStartSLOduration=6.029737255 podStartE2EDuration="6.029737255s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:13.988192693 +0000 UTC m=+1380.331120962" watchObservedRunningTime="2026-02-17 14:29:14.029737255 +0000 UTC m=+1380.372665524"
Feb 17 14:29:14 crc kubenswrapper[4836]: I0217 14:29:14.231956 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.002679 4836 generic.go:334] "Generic (PLEG): container finished" podID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerID="82bc2e3a70ecec92d3994393a44d6752f39e87009a67fa6ce836cf5fea4e8d25" exitCode=0
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.003268 4836 generic.go:334] "Generic (PLEG): container finished" podID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerID="fef50b67c5370254f40d86b0f2cec5c6baf88547e44514223d3def4388ffb9b9" exitCode=143
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.002797 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"bb10908e-7be1-4ca0-8743-7f9aaae820b7","Type":"ContainerDied","Data":"82bc2e3a70ecec92d3994393a44d6752f39e87009a67fa6ce836cf5fea4e8d25"}
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.003434 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"bb10908e-7be1-4ca0-8743-7f9aaae820b7","Type":"ContainerDied","Data":"fef50b67c5370254f40d86b0f2cec5c6baf88547e44514223d3def4388ffb9b9"}
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.024427 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerStarted","Data":"3af1c0902b859232ebd0fe7d2dd1bbcfb19e56b6f4b1d314aace0818279210a3"}
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.205315 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.298517 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-scripts\") pod \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") "
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.298995 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data\") pod \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") "
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.299059 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb10908e-7be1-4ca0-8743-7f9aaae820b7-logs\") pod \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") "
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.299129 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-certs\") pod \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") "
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.299156 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-combined-ca-bundle\") pod \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") "
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.299216 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ftfn\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-kube-api-access-7ftfn\") pod \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") "
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.299280 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data-custom\") pod \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\" (UID: \"bb10908e-7be1-4ca0-8743-7f9aaae820b7\") "
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.300388 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb10908e-7be1-4ca0-8743-7f9aaae820b7-logs" (OuterVolumeSpecName: "logs") pod "bb10908e-7be1-4ca0-8743-7f9aaae820b7" (UID: "bb10908e-7be1-4ca0-8743-7f9aaae820b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.310556 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb10908e-7be1-4ca0-8743-7f9aaae820b7-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.600408 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fc4994bf7-cqhhj"
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.608332 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-certs" (OuterVolumeSpecName: "certs") pod "bb10908e-7be1-4ca0-8743-7f9aaae820b7" (UID: "bb10908e-7be1-4ca0-8743-7f9aaae820b7"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.608474 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bb10908e-7be1-4ca0-8743-7f9aaae820b7" (UID: "bb10908e-7be1-4ca0-8743-7f9aaae820b7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.608732 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-scripts" (OuterVolumeSpecName: "scripts") pod "bb10908e-7be1-4ca0-8743-7f9aaae820b7" (UID: "bb10908e-7be1-4ca0-8743-7f9aaae820b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.609333 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-kube-api-access-7ftfn" (OuterVolumeSpecName: "kube-api-access-7ftfn") pod "bb10908e-7be1-4ca0-8743-7f9aaae820b7" (UID: "bb10908e-7be1-4ca0-8743-7f9aaae820b7"). InnerVolumeSpecName "kube-api-access-7ftfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.614755 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb10908e-7be1-4ca0-8743-7f9aaae820b7" (UID: "bb10908e-7be1-4ca0-8743-7f9aaae820b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.616970 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data" (OuterVolumeSpecName: "config-data") pod "bb10908e-7be1-4ca0-8743-7f9aaae820b7" (UID: "bb10908e-7be1-4ca0-8743-7f9aaae820b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.653424 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.653809 4836 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.653911 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.653995 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ftfn\" (UniqueName: \"kubernetes.io/projected/bb10908e-7be1-4ca0-8743-7f9aaae820b7-kube-api-access-7ftfn\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.654076 4836 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.654158 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb10908e-7be1-4ca0-8743-7f9aaae820b7-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.711060 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56bdc657f6-lhdd4"]
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.711449 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56bdc657f6-lhdd4" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-api" containerID="cri-o://b7a5e210ee7a505ae087f3c56329942b71db962383e4ae1693812dd8340169c8" gracePeriod=30
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.711540 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56bdc657f6-lhdd4" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-httpd" containerID="cri-o://d2098b2a7c4dcbee4fa27ea9bfa1c19e32c5f83e96aa663b877abb8284852c74" gracePeriod=30
Feb 17 14:29:15 crc kubenswrapper[4836]: I0217 14:29:15.993933 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.089191 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerStarted","Data":"b3a6771b8e2f194ac7bcb94abab4f0e58b19807bc132dac6154a588305752da0"}
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.091631 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="39d3cbf1-d107-4004-9eec-698f8f4360b9" containerName="cloudkitty-proc" containerID="cri-o://9ee60ada822c522c9249d0e3c31f511d939804abdb610bce124e951b7000a09d" gracePeriod=30
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.091973 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.106249 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"bb10908e-7be1-4ca0-8743-7f9aaae820b7","Type":"ContainerDied","Data":"d9850d73e5e57596ca879cc8b2c2875e4f65ab2930249ef121a2ca2c42da234e"}
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.106340 4836 scope.go:117] "RemoveContainer" containerID="82bc2e3a70ecec92d3994393a44d6752f39e87009a67fa6ce836cf5fea4e8d25"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.194367 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.238560 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.273104 4836 scope.go:117] "RemoveContainer" containerID="fef50b67c5370254f40d86b0f2cec5c6baf88547e44514223d3def4388ffb9b9"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.273261 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 17 14:29:16 crc kubenswrapper[4836]: E0217 14:29:16.273790 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api-log"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.273808 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api-log"
Feb 17 14:29:16 crc kubenswrapper[4836]: E0217 14:29:16.273860 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.273870 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.274222 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.274255 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" containerName="cloudkitty-api-log"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.292020 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.299026 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.299375 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.316775 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.356032 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.388896 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.409981 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.410076 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-scripts\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.410170 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-logs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.410396 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.410491 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.410542 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-config-data\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.410678 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.410717 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxmlr\" (UniqueName: \"kubernetes.io/projected/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-kube-api-access-wxmlr\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.523626 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.523720 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-scripts\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.523792 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-logs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.523914 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.523992 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.524029 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-config-data\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.524102 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxmlr\" (UniqueName: \"kubernetes.io/projected/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-kube-api-access-wxmlr\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.524143 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.524238 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.536229 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-logs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.590874 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.592185 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-config-data\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.592906 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.593378 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-scripts\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.600044 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.615041 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxmlr\" (UniqueName: \"kubernetes.io/projected/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-kube-api-access-wxmlr\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.631323 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.648822 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb10908e-7be1-4ca0-8743-7f9aaae820b7" path="/var/lib/kubelet/pods/bb10908e-7be1-4ca0-8743-7f9aaae820b7/volumes"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.687858 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49\") " pod="openstack/cloudkitty-api-0"
Feb 17 14:29:16 crc kubenswrapper[4836]: I0217 14:29:16.714597 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 17 14:29:17 crc kubenswrapper[4836]: I0217 14:29:17.151586 4836 generic.go:334] "Generic (PLEG): container finished" podID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerID="d2098b2a7c4dcbee4fa27ea9bfa1c19e32c5f83e96aa663b877abb8284852c74" exitCode=0
Feb 17 14:29:17 crc kubenswrapper[4836]: I0217 14:29:17.151772 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc657f6-lhdd4" event={"ID":"10f74a60-5438-45cd-a8e1-74ccc1c3b16a","Type":"ContainerDied","Data":"d2098b2a7c4dcbee4fa27ea9bfa1c19e32c5f83e96aa663b877abb8284852c74"}
Feb 17 14:29:17 crc kubenswrapper[4836]: I0217 14:29:17.879839 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 17 14:29:18 crc kubenswrapper[4836]: I0217 14:29:18.199678 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49","Type":"ContainerStarted","Data":"171bb4aa1094f8a0584a040c722a33e6019383c4680c5fcc2e47aa3bfa0b5335"}
Feb 17 14:29:18 crc kubenswrapper[4836]: I0217 14:29:18.924270 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="8722776f-950d-46d6-8929-164cc70747af" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.185:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.301581 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49","Type":"ContainerStarted","Data":"b182c0e43bd062d36f7a248e3b4dc068a23d4ee84664564a49eac1b47e4ae8bd"}
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.301642 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49","Type":"ContainerStarted","Data":"03b77758d4164ee76b6cd9772d66859d23a2ed9d8f68d0c6fcf072d038fdaabe"}
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.302476 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.333353 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerStarted","Data":"c8b99a1c879fa5cc6c1f47dcb0736390ea6d8c4736e3f5b0bc65697ec35d7092"}
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.333854 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.337148 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=3.33711234 podStartE2EDuration="3.33711234s" podCreationTimestamp="2026-02-17 14:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:19.333163233 +0000 UTC m=+1385.676091502" watchObservedRunningTime="2026-02-17 14:29:19.33711234 +0000 UTC m=+1385.680040609"
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.368274 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.651975759 podStartE2EDuration="11.368248131s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="2026-02-17 14:29:10.785218917 +0000 UTC m=+1377.128147186" lastFinishedPulling="2026-02-17 14:29:17.501491299 +0000 UTC m=+1383.844419558" observedRunningTime="2026-02-17 14:29:19.360737817 +0000 UTC m=+1385.703666096" watchObservedRunningTime="2026-02-17 14:29:19.368248131 +0000 UTC m=+1385.711176400"
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.428514 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m"
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.529648 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-nvkvs"]
Feb 17 14:29:19 crc kubenswrapper[4836]: I0217 14:29:19.530131 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" podUID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerName="dnsmasq-dns" containerID="cri-o://116ce92f31628ecf8d5384bc487f6288540b5a8b08da5572838c3c49083bb344" gracePeriod=10
Feb 17 14:29:20 crc kubenswrapper[4836]: I0217 14:29:20.455900 4836 generic.go:334] "Generic (PLEG): container finished" podID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerID="b7a5e210ee7a505ae087f3c56329942b71db962383e4ae1693812dd8340169c8" exitCode=0
Feb 17 14:29:20 crc kubenswrapper[4836]: I0217 14:29:20.456025 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc657f6-lhdd4" event={"ID":"10f74a60-5438-45cd-a8e1-74ccc1c3b16a","Type":"ContainerDied","Data":"b7a5e210ee7a505ae087f3c56329942b71db962383e4ae1693812dd8340169c8"}
Feb 17 14:29:20 crc kubenswrapper[4836]: I0217 14:29:20.469158 4836 generic.go:334] "Generic (PLEG): container finished" podID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerID="116ce92f31628ecf8d5384bc487f6288540b5a8b08da5572838c3c49083bb344" exitCode=0
Feb 17 14:29:20 crc kubenswrapper[4836]: I0217 14:29:20.470508 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" event={"ID":"79b71acb-6b55-4f99-8b13-0c5aea065cbb","Type":"ContainerDied","Data":"116ce92f31628ecf8d5384bc487f6288540b5a8b08da5572838c3c49083bb344"}
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.048243 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs"
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.137533 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-nb\") pod \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") "
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.137616 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-svc\") pod \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") "
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.137710 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-config\") pod \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") "
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.137841 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvggg\" (UniqueName: \"kubernetes.io/projected/79b71acb-6b55-4f99-8b13-0c5aea065cbb-kube-api-access-zvggg\") pod \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") "
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.137930 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-sb\") pod \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") "
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.137976 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-swift-storage-0\") pod \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\" (UID: \"79b71acb-6b55-4f99-8b13-0c5aea065cbb\") "
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.152024 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b71acb-6b55-4f99-8b13-0c5aea065cbb-kube-api-access-zvggg" (OuterVolumeSpecName: "kube-api-access-zvggg") pod "79b71acb-6b55-4f99-8b13-0c5aea065cbb" (UID: "79b71acb-6b55-4f99-8b13-0c5aea065cbb"). InnerVolumeSpecName "kube-api-access-zvggg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.175545 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="8722776f-950d-46d6-8929-164cc70747af" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.185:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.431848 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "79b71acb-6b55-4f99-8b13-0c5aea065cbb" (UID: "79b71acb-6b55-4f99-8b13-0c5aea065cbb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.432793 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "79b71acb-6b55-4f99-8b13-0c5aea065cbb" (UID: "79b71acb-6b55-4f99-8b13-0c5aea065cbb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.438535 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "79b71acb-6b55-4f99-8b13-0c5aea065cbb" (UID: "79b71acb-6b55-4f99-8b13-0c5aea065cbb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.440740 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.440782 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.440793 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.440802 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvggg\" (UniqueName: \"kubernetes.io/projected/79b71acb-6b55-4f99-8b13-0c5aea065cbb-kube-api-access-zvggg\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.453501 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "79b71acb-6b55-4f99-8b13-0c5aea065cbb" (UID: "79b71acb-6b55-4f99-8b13-0c5aea065cbb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.480032 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-config" (OuterVolumeSpecName: "config") pod "79b71acb-6b55-4f99-8b13-0c5aea065cbb" (UID: "79b71acb-6b55-4f99-8b13-0c5aea065cbb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.503706 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56bdc657f6-lhdd4" event={"ID":"10f74a60-5438-45cd-a8e1-74ccc1c3b16a","Type":"ContainerDied","Data":"3131621aad6bddf8f2539d514b9526e7c3c20a9b86076d983784e09cb9285473"}
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.503765 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3131621aad6bddf8f2539d514b9526e7c3c20a9b86076d983784e09cb9285473"
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.506078 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56bdc657f6-lhdd4"
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.506686 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" event={"ID":"79b71acb-6b55-4f99-8b13-0c5aea065cbb","Type":"ContainerDied","Data":"1fe3b4e682953cc1e2a6a78ba19a4ca238a5effc3b1823c6d9c0ce3876e226a4"}
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.506738 4836 scope.go:117] "RemoveContainer" containerID="116ce92f31628ecf8d5384bc487f6288540b5a8b08da5572838c3c49083bb344"
Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.506869 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-nvkvs" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.545121 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-combined-ca-bundle\") pod \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.545256 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-ovndb-tls-certs\") pod \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.553550 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-config\") pod \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.553610 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-httpd-config\") pod \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.553969 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whzb4\" (UniqueName: \"kubernetes.io/projected/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-kube-api-access-whzb4\") pod \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\" (UID: \"10f74a60-5438-45cd-a8e1-74ccc1c3b16a\") " Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.556186 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.556205 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/79b71acb-6b55-4f99-8b13-0c5aea065cbb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.562441 4836 scope.go:117] "RemoveContainer" containerID="277bd33eae834b988e7c295c653ee707631d0efdc5453cfacb6a97be01ceb016" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.572471 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "10f74a60-5438-45cd-a8e1-74ccc1c3b16a" (UID: "10f74a60-5438-45cd-a8e1-74ccc1c3b16a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.579720 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-kube-api-access-whzb4" (OuterVolumeSpecName: "kube-api-access-whzb4") pod "10f74a60-5438-45cd-a8e1-74ccc1c3b16a" (UID: "10f74a60-5438-45cd-a8e1-74ccc1c3b16a"). InnerVolumeSpecName "kube-api-access-whzb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.617367 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-nvkvs"] Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.637629 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-nvkvs"] Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.663333 4836 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.663392 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whzb4\" (UniqueName: \"kubernetes.io/projected/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-kube-api-access-whzb4\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.672037 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10f74a60-5438-45cd-a8e1-74ccc1c3b16a" (UID: "10f74a60-5438-45cd-a8e1-74ccc1c3b16a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.688518 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "10f74a60-5438-45cd-a8e1-74ccc1c3b16a" (UID: "10f74a60-5438-45cd-a8e1-74ccc1c3b16a"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.696549 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-config" (OuterVolumeSpecName: "config") pod "10f74a60-5438-45cd-a8e1-74ccc1c3b16a" (UID: "10f74a60-5438-45cd-a8e1-74ccc1c3b16a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.766397 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.766472 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:21 crc kubenswrapper[4836]: I0217 14:29:21.766486 4836 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/10f74a60-5438-45cd-a8e1-74ccc1c3b16a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.528859 4836 generic.go:334] "Generic (PLEG): container finished" podID="39d3cbf1-d107-4004-9eec-698f8f4360b9" containerID="9ee60ada822c522c9249d0e3c31f511d939804abdb610bce124e951b7000a09d" exitCode=0 Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.528976 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"39d3cbf1-d107-4004-9eec-698f8f4360b9","Type":"ContainerDied","Data":"9ee60ada822c522c9249d0e3c31f511d939804abdb610bce124e951b7000a09d"} Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.529432 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" 
event={"ID":"39d3cbf1-d107-4004-9eec-698f8f4360b9","Type":"ContainerDied","Data":"9d18961b4f807b2d078a92a071d329120e24d89e463eadbb04ec662d87231dc8"} Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.529454 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d18961b4f807b2d078a92a071d329120e24d89e463eadbb04ec662d87231dc8" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.537959 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56bdc657f6-lhdd4" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.560601 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.835969 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data-custom\") pod \"39d3cbf1-d107-4004-9eec-698f8f4360b9\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.836090 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-scripts\") pod \"39d3cbf1-d107-4004-9eec-698f8f4360b9\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.836172 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data\") pod \"39d3cbf1-d107-4004-9eec-698f8f4360b9\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.836373 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-combined-ca-bundle\") pod \"39d3cbf1-d107-4004-9eec-698f8f4360b9\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.836407 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-certs\") pod \"39d3cbf1-d107-4004-9eec-698f8f4360b9\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.836444 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gqz9\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-kube-api-access-8gqz9\") pod \"39d3cbf1-d107-4004-9eec-698f8f4360b9\" (UID: \"39d3cbf1-d107-4004-9eec-698f8f4360b9\") " Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.845601 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-scripts" (OuterVolumeSpecName: "scripts") pod "39d3cbf1-d107-4004-9eec-698f8f4360b9" (UID: "39d3cbf1-d107-4004-9eec-698f8f4360b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.846257 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-kube-api-access-8gqz9" (OuterVolumeSpecName: "kube-api-access-8gqz9") pod "39d3cbf1-d107-4004-9eec-698f8f4360b9" (UID: "39d3cbf1-d107-4004-9eec-698f8f4360b9"). InnerVolumeSpecName "kube-api-access-8gqz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.849019 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" path="/var/lib/kubelet/pods/79b71acb-6b55-4f99-8b13-0c5aea065cbb/volumes" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.857993 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "39d3cbf1-d107-4004-9eec-698f8f4360b9" (UID: "39d3cbf1-d107-4004-9eec-698f8f4360b9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.878727 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-certs" (OuterVolumeSpecName: "certs") pod "39d3cbf1-d107-4004-9eec-698f8f4360b9" (UID: "39d3cbf1-d107-4004-9eec-698f8f4360b9"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.897946 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56bdc657f6-lhdd4"] Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.913582 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data" (OuterVolumeSpecName: "config-data") pod "39d3cbf1-d107-4004-9eec-698f8f4360b9" (UID: "39d3cbf1-d107-4004-9eec-698f8f4360b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.940425 4836 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.940469 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gqz9\" (UniqueName: \"kubernetes.io/projected/39d3cbf1-d107-4004-9eec-698f8f4360b9-kube-api-access-8gqz9\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.940485 4836 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.940496 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.940507 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:22 crc kubenswrapper[4836]: I0217 14:29:22.941633 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56bdc657f6-lhdd4"] Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.044573 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39d3cbf1-d107-4004-9eec-698f8f4360b9" (UID: "39d3cbf1-d107-4004-9eec-698f8f4360b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.145722 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39d3cbf1-d107-4004-9eec-698f8f4360b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.547714 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.599684 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.618993 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.655258 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:23 crc kubenswrapper[4836]: E0217 14:29:23.659584 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-api" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.659732 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-api" Feb 17 14:29:23 crc kubenswrapper[4836]: E0217 14:29:23.659872 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d3cbf1-d107-4004-9eec-698f8f4360b9" containerName="cloudkitty-proc" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.659885 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d3cbf1-d107-4004-9eec-698f8f4360b9" containerName="cloudkitty-proc" Feb 17 14:29:23 crc kubenswrapper[4836]: E0217 14:29:23.659909 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerName="init" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 
14:29:23.659916 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerName="init" Feb 17 14:29:23 crc kubenswrapper[4836]: E0217 14:29:23.660034 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerName="dnsmasq-dns" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.660069 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerName="dnsmasq-dns" Feb 17 14:29:23 crc kubenswrapper[4836]: E0217 14:29:23.660274 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-httpd" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.660284 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-httpd" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.664784 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-api" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.664849 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" containerName="neutron-httpd" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.664907 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b71acb-6b55-4f99-8b13-0c5aea065cbb" containerName="dnsmasq-dns" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.664936 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d3cbf1-d107-4004-9eec-698f8f4360b9" containerName="cloudkitty-proc" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.704544 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.708282 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 17 14:29:23 crc kubenswrapper[4836]: I0217 14:29:23.715328 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.066765 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-scripts\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.066856 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4zk4\" (UniqueName: \"kubernetes.io/projected/79c00bb2-9487-433a-be90-07b6d885e4d0-kube-api-access-t4zk4\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.066902 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79c00bb2-9487-433a-be90-07b6d885e4d0-certs\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.066930 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.066977 4836 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.067245 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-config-data\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.168841 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-scripts\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.168945 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4zk4\" (UniqueName: \"kubernetes.io/projected/79c00bb2-9487-433a-be90-07b6d885e4d0-kube-api-access-t4zk4\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.168984 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79c00bb2-9487-433a-be90-07b6d885e4d0-certs\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.169009 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.169044 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.169084 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-config-data\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.173690 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-config-data\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.174608 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.177103 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" 
Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.178274 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/79c00bb2-9487-433a-be90-07b6d885e4d0-certs\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.181851 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79c00bb2-9487-433a-be90-07b6d885e4d0-scripts\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.195283 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4zk4\" (UniqueName: \"kubernetes.io/projected/79c00bb2-9487-433a-be90-07b6d885e4d0-kube-api-access-t4zk4\") pod \"cloudkitty-proc-0\" (UID: \"79c00bb2-9487-433a-be90-07b6d885e4d0\") " pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.352675 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.605614 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f74a60-5438-45cd-a8e1-74ccc1c3b16a" path="/var/lib/kubelet/pods/10f74a60-5438-45cd-a8e1-74ccc1c3b16a/volumes" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.607109 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d3cbf1-d107-4004-9eec-698f8f4360b9" path="/var/lib/kubelet/pods/39d3cbf1-d107-4004-9eec-698f8f4360b9/volumes" Feb 17 14:29:24 crc kubenswrapper[4836]: I0217 14:29:24.955152 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 17 14:29:24 crc kubenswrapper[4836]: W0217 14:29:24.972655 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c00bb2_9487_433a_be90_07b6d885e4d0.slice/crio-037da9c804a575629b2d00187037dbbd47c8894e5055ae16baa77a9785db9029 WatchSource:0}: Error finding container 037da9c804a575629b2d00187037dbbd47c8894e5055ae16baa77a9785db9029: Status 404 returned error can't find the container with id 037da9c804a575629b2d00187037dbbd47c8894e5055ae16baa77a9785db9029 Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.617666 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"79c00bb2-9487-433a-be90-07b6d885e4d0","Type":"ContainerStarted","Data":"4822625b39e69e126abd0c471117fa05d8239917395a393efe328d6f62d1df58"} Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.619498 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"79c00bb2-9487-433a-be90-07b6d885e4d0","Type":"ContainerStarted","Data":"037da9c804a575629b2d00187037dbbd47c8894e5055ae16baa77a9785db9029"} Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.650676 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.650655536 podStartE2EDuration="2.650655536s" podCreationTimestamp="2026-02-17 14:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:25.646779191 +0000 UTC m=+1391.989707470" watchObservedRunningTime="2026-02-17 14:29:25.650655536 +0000 UTC m=+1391.993583805" Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.981614 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5d87f46c5f-vfn9f"] Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.984457 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.988231 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.988444 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 14:29:25 crc kubenswrapper[4836]: I0217 14:29:25.988756 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.017705 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5d87f46c5f-vfn9f"] Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.182813 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-config-data\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.182998 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a17ffb1e-09d2-4524-8c33-e50e15b9031d-run-httpd\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.183047 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqlh\" (UniqueName: \"kubernetes.io/projected/a17ffb1e-09d2-4524-8c33-e50e15b9031d-kube-api-access-knqlh\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.183147 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-public-tls-certs\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.183182 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a17ffb1e-09d2-4524-8c33-e50e15b9031d-log-httpd\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.183219 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a17ffb1e-09d2-4524-8c33-e50e15b9031d-etc-swift\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.183253 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-internal-tls-certs\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.183334 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-combined-ca-bundle\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.285659 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a17ffb1e-09d2-4524-8c33-e50e15b9031d-run-httpd\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.286001 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqlh\" (UniqueName: \"kubernetes.io/projected/a17ffb1e-09d2-4524-8c33-e50e15b9031d-kube-api-access-knqlh\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.286179 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-public-tls-certs\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.286480 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a17ffb1e-09d2-4524-8c33-e50e15b9031d-log-httpd\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.286675 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a17ffb1e-09d2-4524-8c33-e50e15b9031d-etc-swift\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.287231 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-internal-tls-certs\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.287383 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-combined-ca-bundle\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.287518 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-config-data\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.286904 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a17ffb1e-09d2-4524-8c33-e50e15b9031d-log-httpd\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.299808 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a17ffb1e-09d2-4524-8c33-e50e15b9031d-run-httpd\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.302790 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-internal-tls-certs\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.302948 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-config-data\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.303088 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-combined-ca-bundle\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.307073 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a17ffb1e-09d2-4524-8c33-e50e15b9031d-public-tls-certs\") pod 
\"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.307378 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a17ffb1e-09d2-4524-8c33-e50e15b9031d-etc-swift\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.311969 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqlh\" (UniqueName: \"kubernetes.io/projected/a17ffb1e-09d2-4524-8c33-e50e15b9031d-kube-api-access-knqlh\") pod \"swift-proxy-5d87f46c5f-vfn9f\" (UID: \"a17ffb1e-09d2-4524-8c33-e50e15b9031d\") " pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:26 crc kubenswrapper[4836]: I0217 14:29:26.610755 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.287653 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.294771 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-central-agent" containerID="cri-o://fb5cf9ee8d101cc6fae1fb5f79f35c27c28cf9fc0fa0631bc345f006efba64c9" gracePeriod=30 Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.295002 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="proxy-httpd" containerID="cri-o://c8b99a1c879fa5cc6c1f47dcb0736390ea6d8c4736e3f5b0bc65697ec35d7092" gracePeriod=30 Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.295056 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="sg-core" containerID="cri-o://b3a6771b8e2f194ac7bcb94abab4f0e58b19807bc132dac6154a588305752da0" gracePeriod=30 Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.295106 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-notification-agent" containerID="cri-o://3af1c0902b859232ebd0fe7d2dd1bbcfb19e56b6f4b1d314aace0818279210a3" gracePeriod=30 Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.706566 4836 generic.go:334] "Generic (PLEG): container finished" podID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerID="c8b99a1c879fa5cc6c1f47dcb0736390ea6d8c4736e3f5b0bc65697ec35d7092" exitCode=0 Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.706607 4836 generic.go:334] "Generic (PLEG): container finished" 
podID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerID="b3a6771b8e2f194ac7bcb94abab4f0e58b19807bc132dac6154a588305752da0" exitCode=2 Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.706633 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerDied","Data":"c8b99a1c879fa5cc6c1f47dcb0736390ea6d8c4736e3f5b0bc65697ec35d7092"} Feb 17 14:29:27 crc kubenswrapper[4836]: I0217 14:29:27.706666 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerDied","Data":"b3a6771b8e2f194ac7bcb94abab4f0e58b19807bc132dac6154a588305752da0"} Feb 17 14:29:28 crc kubenswrapper[4836]: I0217 14:29:28.738153 4836 generic.go:334] "Generic (PLEG): container finished" podID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerID="3af1c0902b859232ebd0fe7d2dd1bbcfb19e56b6f4b1d314aace0818279210a3" exitCode=0 Feb 17 14:29:28 crc kubenswrapper[4836]: I0217 14:29:28.738209 4836 generic.go:334] "Generic (PLEG): container finished" podID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerID="fb5cf9ee8d101cc6fae1fb5f79f35c27c28cf9fc0fa0631bc345f006efba64c9" exitCode=0 Feb 17 14:29:28 crc kubenswrapper[4836]: I0217 14:29:28.738242 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerDied","Data":"3af1c0902b859232ebd0fe7d2dd1bbcfb19e56b6f4b1d314aace0818279210a3"} Feb 17 14:29:28 crc kubenswrapper[4836]: I0217 14:29:28.738345 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerDied","Data":"fb5cf9ee8d101cc6fae1fb5f79f35c27c28cf9fc0fa0631bc345f006efba64c9"} Feb 17 14:29:29 crc kubenswrapper[4836]: I0217 14:29:29.765007 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:29:29 crc kubenswrapper[4836]: I0217 14:29:29.765446 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:29:35 crc kubenswrapper[4836]: E0217 14:29:35.676148 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Feb 17 14:29:35 crc kubenswrapper[4836]: E0217 14:29:35.677548 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65h677h7bh89h55dh65hbh5fh5f4h5b7h66fh5bdh6ch95h5d6hc4h598h5fch54bh5h5f6h5d9hdh79h678h8dh95h55h5c5hdhd5h5b9q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_CA_CERT,Value:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secre
t,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t8pml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(4fe674a8-c32b-412e-8d20-2a6e7e18bb10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 14:29:35 crc kubenswrapper[4836]: E0217 14:29:35.678836 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="4fe674a8-c32b-412e-8d20-2a6e7e18bb10" Feb 17 14:29:36 crc kubenswrapper[4836]: E0217 14:29:36.020456 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="4fe674a8-c32b-412e-8d20-2a6e7e18bb10" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.237496 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.395740 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5d87f46c5f-vfn9f"] Feb 17 14:29:36 crc kubenswrapper[4836]: W0217 14:29:36.398988 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda17ffb1e_09d2_4524_8c33_e50e15b9031d.slice/crio-24bb742ba2025f263a0fd97d88da8e8a321ca187d115b21d9de3b0fa231a07a0 WatchSource:0}: Error finding container 24bb742ba2025f263a0fd97d88da8e8a321ca187d115b21d9de3b0fa231a07a0: Status 404 returned error can't find the container with id 24bb742ba2025f263a0fd97d88da8e8a321ca187d115b21d9de3b0fa231a07a0 Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.440004 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-combined-ca-bundle\") pod \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.440167 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7pcw\" (UniqueName: \"kubernetes.io/projected/d9f887f5-6ce0-4320-94fc-024b1b9ef725-kube-api-access-b7pcw\") pod \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.440317 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-log-httpd\") pod \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.441272 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d9f887f5-6ce0-4320-94fc-024b1b9ef725" (UID: "d9f887f5-6ce0-4320-94fc-024b1b9ef725"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.441522 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-sg-core-conf-yaml\") pod \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.442052 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-config-data\") pod \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.442102 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-scripts\") pod \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\" (UID: \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.442243 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-run-httpd\") pod \"d9f887f5-6ce0-4320-94fc-024b1b9ef725\" (UID: 
\"d9f887f5-6ce0-4320-94fc-024b1b9ef725\") " Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.443158 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d9f887f5-6ce0-4320-94fc-024b1b9ef725" (UID: "d9f887f5-6ce0-4320-94fc-024b1b9ef725"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.443193 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.443441 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f887f5-6ce0-4320-94fc-024b1b9ef725-kube-api-access-b7pcw" (OuterVolumeSpecName: "kube-api-access-b7pcw") pod "d9f887f5-6ce0-4320-94fc-024b1b9ef725" (UID: "d9f887f5-6ce0-4320-94fc-024b1b9ef725"). InnerVolumeSpecName "kube-api-access-b7pcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.447997 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-scripts" (OuterVolumeSpecName: "scripts") pod "d9f887f5-6ce0-4320-94fc-024b1b9ef725" (UID: "d9f887f5-6ce0-4320-94fc-024b1b9ef725"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.489967 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d9f887f5-6ce0-4320-94fc-024b1b9ef725" (UID: "d9f887f5-6ce0-4320-94fc-024b1b9ef725"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.545258 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d9f887f5-6ce0-4320-94fc-024b1b9ef725-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.545507 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7pcw\" (UniqueName: \"kubernetes.io/projected/d9f887f5-6ce0-4320-94fc-024b1b9ef725-kube-api-access-b7pcw\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.545615 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.545691 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.552944 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9f887f5-6ce0-4320-94fc-024b1b9ef725" (UID: "d9f887f5-6ce0-4320-94fc-024b1b9ef725"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.603499 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-config-data" (OuterVolumeSpecName: "config-data") pod "d9f887f5-6ce0-4320-94fc-024b1b9ef725" (UID: "d9f887f5-6ce0-4320-94fc-024b1b9ef725"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.648091 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:36 crc kubenswrapper[4836]: I0217 14:29:36.648128 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f887f5-6ce0-4320-94fc-024b1b9ef725-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.209206 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" event={"ID":"a17ffb1e-09d2-4524-8c33-e50e15b9031d","Type":"ContainerStarted","Data":"7821eb4bc638ddc9a8abc154e0b8520b0768fce182177f2c81c9157b5724d831"} Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.210522 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" event={"ID":"a17ffb1e-09d2-4524-8c33-e50e15b9031d","Type":"ContainerStarted","Data":"24bb742ba2025f263a0fd97d88da8e8a321ca187d115b21d9de3b0fa231a07a0"} Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.218828 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d9f887f5-6ce0-4320-94fc-024b1b9ef725","Type":"ContainerDied","Data":"2b8910649b123c250a9b2ae2a0273df5052a76cf9ac3a4d666b31acdde9dcd6e"} Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.219042 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.219180 4836 scope.go:117] "RemoveContainer" containerID="c8b99a1c879fa5cc6c1f47dcb0736390ea6d8c4736e3f5b0bc65697ec35d7092" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.264069 4836 scope.go:117] "RemoveContainer" containerID="b3a6771b8e2f194ac7bcb94abab4f0e58b19807bc132dac6154a588305752da0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.265443 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.287481 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.309288 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:37 crc kubenswrapper[4836]: E0217 14:29:37.309896 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-central-agent" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.309918 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-central-agent" Feb 17 14:29:37 crc kubenswrapper[4836]: E0217 14:29:37.309948 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-notification-agent" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.309954 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-notification-agent" Feb 17 14:29:37 crc kubenswrapper[4836]: E0217 14:29:37.309982 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="sg-core" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.309988 4836 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="sg-core" Feb 17 14:29:37 crc kubenswrapper[4836]: E0217 14:29:37.309999 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="proxy-httpd" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.310005 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="proxy-httpd" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.310220 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-notification-agent" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.310242 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="sg-core" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.310253 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="proxy-httpd" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.310265 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" containerName="ceilometer-central-agent" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.312998 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.321955 4836 scope.go:117] "RemoveContainer" containerID="3af1c0902b859232ebd0fe7d2dd1bbcfb19e56b6f4b1d314aace0818279210a3" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.325462 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.325684 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.351212 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.355774 4836 scope.go:117] "RemoveContainer" containerID="fb5cf9ee8d101cc6fae1fb5f79f35c27c28cf9fc0fa0631bc345f006efba64c9" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.365708 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-scripts\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.365773 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.365858 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-log-httpd\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 
17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.365915 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7ns\" (UniqueName: \"kubernetes.io/projected/56c9a452-ffd5-4b03-97a9-93546a194414-kube-api-access-pq7ns\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.366026 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-config-data\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.366096 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-run-httpd\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.366186 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.422384 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.422721 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-log" 
containerID="cri-o://4231e0f0134e5c8db2d1379ad611e9d1ddd911c706b7c534c46f5a480fa7035b" gracePeriod=30 Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.423320 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-httpd" containerID="cri-o://ffac93583d3a46218a79cd0eec11b0e9213bdce6e0622ee8ec1b1030a56cebbf" gracePeriod=30 Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.469243 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7ns\" (UniqueName: \"kubernetes.io/projected/56c9a452-ffd5-4b03-97a9-93546a194414-kube-api-access-pq7ns\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.469393 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-config-data\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.469462 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-run-httpd\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.469504 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.469614 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-scripts\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.469648 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.469697 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-log-httpd\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.470484 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-log-httpd\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.471432 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-run-httpd\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.478422 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 
14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.482557 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-config-data\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.483807 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-scripts\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.486208 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.497178 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7ns\" (UniqueName: \"kubernetes.io/projected/56c9a452-ffd5-4b03-97a9-93546a194414-kube-api-access-pq7ns\") pod \"ceilometer-0\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") " pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.648248 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.949252 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-q8wrd"] Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.951004 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.961149 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q8wrd"] Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.983231 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88b1aa3a-dc15-4ec1-ba76-8246e300422f-operator-scripts\") pod \"nova-api-db-create-q8wrd\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:37 crc kubenswrapper[4836]: I0217 14:29:37.983597 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2z4m\" (UniqueName: \"kubernetes.io/projected/88b1aa3a-dc15-4ec1-ba76-8246e300422f-kube-api-access-d2z4m\") pod \"nova-api-db-create-q8wrd\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.133192 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2z4m\" (UniqueName: \"kubernetes.io/projected/88b1aa3a-dc15-4ec1-ba76-8246e300422f-kube-api-access-d2z4m\") pod \"nova-api-db-create-q8wrd\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.133906 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88b1aa3a-dc15-4ec1-ba76-8246e300422f-operator-scripts\") pod \"nova-api-db-create-q8wrd\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.141398 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/88b1aa3a-dc15-4ec1-ba76-8246e300422f-operator-scripts\") pod \"nova-api-db-create-q8wrd\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.175238 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2z4m\" (UniqueName: \"kubernetes.io/projected/88b1aa3a-dc15-4ec1-ba76-8246e300422f-kube-api-access-d2z4m\") pod \"nova-api-db-create-q8wrd\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.192974 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-npl52"] Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.225431 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-npl52"] Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.232410 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.587171 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.600656 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db342a3d-55f5-4b0c-b96f-327014b6fb82-operator-scripts\") pod \"nova-cell0-db-create-npl52\" (UID: \"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.600738 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknvs\" (UniqueName: \"kubernetes.io/projected/db342a3d-55f5-4b0c-b96f-327014b6fb82-kube-api-access-hknvs\") pod \"nova-cell0-db-create-npl52\" (UID: \"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.606564 4836 generic.go:334] "Generic (PLEG): container finished" podID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerID="4231e0f0134e5c8db2d1379ad611e9d1ddd911c706b7c534c46f5a480fa7035b" exitCode=143 Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.634889 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f887f5-6ce0-4320-94fc-024b1b9ef725" path="/var/lib/kubelet/pods/d9f887f5-6ce0-4320-94fc-024b1b9ef725/volumes" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.639480 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c29f84b9-3879-4fc6-b2aa-e334bd08f24e","Type":"ContainerDied","Data":"4231e0f0134e5c8db2d1379ad611e9d1ddd911c706b7c534c46f5a480fa7035b"} Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.650988 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" 
event={"ID":"a17ffb1e-09d2-4524-8c33-e50e15b9031d","Type":"ContainerStarted","Data":"86a79ba10952e5f4ee8bb7f3e479555554f9c751b514251142b7ca704ac5d0dc"} Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.651413 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.651434 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.702698 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db342a3d-55f5-4b0c-b96f-327014b6fb82-operator-scripts\") pod \"nova-cell0-db-create-npl52\" (UID: \"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.702830 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hknvs\" (UniqueName: \"kubernetes.io/projected/db342a3d-55f5-4b0c-b96f-327014b6fb82-kube-api-access-hknvs\") pod \"nova-cell0-db-create-npl52\" (UID: \"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.704730 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db342a3d-55f5-4b0c-b96f-327014b6fb82-operator-scripts\") pod \"nova-cell0-db-create-npl52\" (UID: \"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.743840 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hknvs\" (UniqueName: \"kubernetes.io/projected/db342a3d-55f5-4b0c-b96f-327014b6fb82-kube-api-access-hknvs\") pod \"nova-cell0-db-create-npl52\" (UID: 
\"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.799585 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a7c4-account-create-update-qj5lb"] Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.801443 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.808644 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.827596 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a7c4-account-create-update-qj5lb"] Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.873480 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5h5m9"] Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.876188 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.894107 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5h5m9"] Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.910232 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0312359b-98a6-49c7-83f1-fb44c679e8aa-operator-scripts\") pod \"nova-cell1-db-create-5h5m9\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") " pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.910617 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsfl\" (UniqueName: \"kubernetes.io/projected/0312359b-98a6-49c7-83f1-fb44c679e8aa-kube-api-access-nmsfl\") pod \"nova-cell1-db-create-5h5m9\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") " pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.910830 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-operator-scripts\") pod \"nova-api-a7c4-account-create-update-qj5lb\" (UID: \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") " pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.910920 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2xjk\" (UniqueName: \"kubernetes.io/projected/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-kube-api-access-f2xjk\") pod \"nova-api-a7c4-account-create-update-qj5lb\" (UID: \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") " pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 
14:29:38.916257 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" podStartSLOduration=13.916213951 podStartE2EDuration="13.916213951s" podCreationTimestamp="2026-02-17 14:29:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:38.706321833 +0000 UTC m=+1405.049250102" watchObservedRunningTime="2026-02-17 14:29:38.916213951 +0000 UTC m=+1405.259142240" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.951798 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.968581 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8fba-account-create-update-gqd5n"] Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.970331 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.975817 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 17 14:29:38 crc kubenswrapper[4836]: I0217 14:29:38.979068 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8fba-account-create-update-gqd5n"] Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.012961 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8171da-ad25-4388-9dab-2afc19993d97-operator-scripts\") pod \"nova-cell0-8fba-account-create-update-gqd5n\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") " pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.014169 4836 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f2xjk\" (UniqueName: \"kubernetes.io/projected/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-kube-api-access-f2xjk\") pod \"nova-api-a7c4-account-create-update-qj5lb\" (UID: \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") " pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.014593 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bk6l\" (UniqueName: \"kubernetes.io/projected/0b8171da-ad25-4388-9dab-2afc19993d97-kube-api-access-9bk6l\") pod \"nova-cell0-8fba-account-create-update-gqd5n\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") " pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.014781 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0312359b-98a6-49c7-83f1-fb44c679e8aa-operator-scripts\") pod \"nova-cell1-db-create-5h5m9\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") " pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.014936 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsfl\" (UniqueName: \"kubernetes.io/projected/0312359b-98a6-49c7-83f1-fb44c679e8aa-kube-api-access-nmsfl\") pod \"nova-cell1-db-create-5h5m9\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") " pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.032381 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-28f5-account-create-update-74tvm"] Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.035041 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.042513 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.064741 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-28f5-account-create-update-74tvm"] Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.073748 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-operator-scripts\") pod \"nova-api-a7c4-account-create-update-qj5lb\" (UID: \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") " pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.074706 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0312359b-98a6-49c7-83f1-fb44c679e8aa-operator-scripts\") pod \"nova-cell1-db-create-5h5m9\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") " pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.075859 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-operator-scripts\") pod \"nova-api-a7c4-account-create-update-qj5lb\" (UID: \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") " pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.083639 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2xjk\" (UniqueName: \"kubernetes.io/projected/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-kube-api-access-f2xjk\") pod \"nova-api-a7c4-account-create-update-qj5lb\" (UID: 
\"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") " pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.122859 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsfl\" (UniqueName: \"kubernetes.io/projected/0312359b-98a6-49c7-83f1-fb44c679e8aa-kube-api-access-nmsfl\") pod \"nova-cell1-db-create-5h5m9\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") " pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.152151 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a7c4-account-create-update-qj5lb" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.183759 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8171da-ad25-4388-9dab-2afc19993d97-operator-scripts\") pod \"nova-cell0-8fba-account-create-update-gqd5n\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") " pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.183853 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dc00367-2940-413d-872a-74d4fa37fc1f-operator-scripts\") pod \"nova-cell1-28f5-account-create-update-74tvm\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") " pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.183918 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbccw\" (UniqueName: \"kubernetes.io/projected/4dc00367-2940-413d-872a-74d4fa37fc1f-kube-api-access-dbccw\") pod \"nova-cell1-28f5-account-create-update-74tvm\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") " 
pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.183977 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bk6l\" (UniqueName: \"kubernetes.io/projected/0b8171da-ad25-4388-9dab-2afc19993d97-kube-api-access-9bk6l\") pod \"nova-cell0-8fba-account-create-update-gqd5n\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") " pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.185520 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8171da-ad25-4388-9dab-2afc19993d97-operator-scripts\") pod \"nova-cell0-8fba-account-create-update-gqd5n\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") " pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.213965 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bk6l\" (UniqueName: \"kubernetes.io/projected/0b8171da-ad25-4388-9dab-2afc19993d97-kube-api-access-9bk6l\") pod \"nova-cell0-8fba-account-create-update-gqd5n\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") " pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.225442 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5h5m9" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.285515 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dc00367-2940-413d-872a-74d4fa37fc1f-operator-scripts\") pod \"nova-cell1-28f5-account-create-update-74tvm\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") " pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.285852 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbccw\" (UniqueName: \"kubernetes.io/projected/4dc00367-2940-413d-872a-74d4fa37fc1f-kube-api-access-dbccw\") pod \"nova-cell1-28f5-account-create-update-74tvm\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") " pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.287837 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dc00367-2940-413d-872a-74d4fa37fc1f-operator-scripts\") pod \"nova-cell1-28f5-account-create-update-74tvm\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") " pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.315258 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbccw\" (UniqueName: \"kubernetes.io/projected/4dc00367-2940-413d-872a-74d4fa37fc1f-kube-api-access-dbccw\") pod \"nova-cell1-28f5-account-create-update-74tvm\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") " pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.332682 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.788337 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-28f5-account-create-update-74tvm" Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.922496 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:39 crc kubenswrapper[4836]: I0217 14:29:39.940113 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q8wrd"] Feb 17 14:29:40 crc kubenswrapper[4836]: W0217 14:29:40.008843 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88b1aa3a_dc15_4ec1_ba76_8246e300422f.slice/crio-bb26936403352ce5d4b38858c684c231085b015394ef1a491b1db62d38cc94f0 WatchSource:0}: Error finding container bb26936403352ce5d4b38858c684c231085b015394ef1a491b1db62d38cc94f0: Status 404 returned error can't find the container with id bb26936403352ce5d4b38858c684c231085b015394ef1a491b1db62d38cc94f0 Feb 17 14:29:40 crc kubenswrapper[4836]: I0217 14:29:40.227652 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-npl52"] Feb 17 14:29:40 crc kubenswrapper[4836]: W0217 14:29:40.668327 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7d61f8c_4804_49b6_937e_fbaf20aa3ed2.slice/crio-a8bfacaf56b208729fed6ae7379213c44a5bf9bbc00aaa497d58947acfd5fda8 WatchSource:0}: Error finding container a8bfacaf56b208729fed6ae7379213c44a5bf9bbc00aaa497d58947acfd5fda8: Status 404 returned error can't find the container with id a8bfacaf56b208729fed6ae7379213c44a5bf9bbc00aaa497d58947acfd5fda8 Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.025848 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5h5m9"] 
Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.081839 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5h5m9" event={"ID":"0312359b-98a6-49c7-83f1-fb44c679e8aa","Type":"ContainerStarted","Data":"8aebd8b0cf09f0b5a71ad7edb46a57f5a3212f3d6f8147621e038f3b2d4a75eb"} Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.105675 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8wrd" event={"ID":"88b1aa3a-dc15-4ec1-ba76-8246e300422f","Type":"ContainerStarted","Data":"e2428efba069899bf573bcb1f933d6f640083a8f0e4830cd36751b8b3332488d"} Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.106074 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8wrd" event={"ID":"88b1aa3a-dc15-4ec1-ba76-8246e300422f","Type":"ContainerStarted","Data":"bb26936403352ce5d4b38858c684c231085b015394ef1a491b1db62d38cc94f0"} Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.122966 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerStarted","Data":"3732eb36b7746243f4a9bad758b1bf9afb106bf058ee751c3feddbab6042cb9c"} Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.151412 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a7c4-account-create-update-qj5lb"] Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.162060 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8fba-account-create-update-gqd5n"] Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.167728 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-npl52" event={"ID":"db342a3d-55f5-4b0c-b96f-327014b6fb82","Type":"ContainerStarted","Data":"7dba2d07908548962f40435efa50aed2a21f68c9f55a50ad39cc396d718c6cf2"} Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.167796 4836 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-db-create-npl52" event={"ID":"db342a3d-55f5-4b0c-b96f-327014b6fb82","Type":"ContainerStarted","Data":"582663419ea06870f82c61e67b714be6e79694fd2b49d90ddf21ffdb14cf9940"} Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.191597 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a7c4-account-create-update-qj5lb" event={"ID":"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2","Type":"ContainerStarted","Data":"a8bfacaf56b208729fed6ae7379213c44a5bf9bbc00aaa497d58947acfd5fda8"} Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.219195 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-28f5-account-create-update-74tvm"] Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.261390 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.261785 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-log" containerID="cri-o://2d37a99072f4fb6a9bc38dee8c6986d96ef5977cd1d2c3dca6d3d95cb5f3bcee" gracePeriod=30 Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.262555 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-httpd" containerID="cri-o://253884c8bfee6f38dc03fef1da6c5e47b92d31a3b1592567360ef3f04d7144a9" gracePeriod=30 Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.281970 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-npl52" podStartSLOduration=3.281934907 podStartE2EDuration="3.281934907s" podCreationTimestamp="2026-02-17 14:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:41.20837493 +0000 UTC m=+1407.551303209" watchObservedRunningTime="2026-02-17 14:29:41.281934907 +0000 UTC m=+1407.624863176" Feb 17 14:29:41 crc kubenswrapper[4836]: I0217 14:29:41.630146 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5d87f46c5f-vfn9f" Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.635159 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerStarted","Data":"d3bbefd170a172c21cb9f9e3cfad807a2c0bb5fe5338d9d264fb6ae4c6ff5de7"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.641411 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" event={"ID":"0b8171da-ad25-4388-9dab-2afc19993d97","Type":"ContainerStarted","Data":"4978da281b4ffb4cbde1dc06e973f40c67a116248d8a8623898e48ea004f575f"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.663972 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a7c4-account-create-update-qj5lb" event={"ID":"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2","Type":"ContainerStarted","Data":"940b27e8f09ea23f3f385f55c83e9233f241038d9dc1c8761036c1c3dbf2e000"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.692446 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5h5m9" event={"ID":"0312359b-98a6-49c7-83f1-fb44c679e8aa","Type":"ContainerStarted","Data":"66b9158b23020b3eaa0a3cea1af11df9fcdac6316e74751284cbec084e23c3a0"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.719913 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-a7c4-account-create-update-qj5lb" podStartSLOduration=4.719883018 podStartE2EDuration="4.719883018s" podCreationTimestamp="2026-02-17 14:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:42.706776283 +0000 UTC m=+1409.049704552" watchObservedRunningTime="2026-02-17 14:29:42.719883018 +0000 UTC m=+1409.062811307" Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.721767 4836 generic.go:334] "Generic (PLEG): container finished" podID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerID="ffac93583d3a46218a79cd0eec11b0e9213bdce6e0622ee8ec1b1030a56cebbf" exitCode=0 Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.721944 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c29f84b9-3879-4fc6-b2aa-e334bd08f24e","Type":"ContainerDied","Data":"ffac93583d3a46218a79cd0eec11b0e9213bdce6e0622ee8ec1b1030a56cebbf"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.735241 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-28f5-account-create-update-74tvm" event={"ID":"4dc00367-2940-413d-872a-74d4fa37fc1f","Type":"ContainerStarted","Data":"684fea3361f7992d7677d58a81ef405045d31b021d431929ec2a4e0d9ce8e5bf"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.739884 4836 generic.go:334] "Generic (PLEG): container finished" podID="db342a3d-55f5-4b0c-b96f-327014b6fb82" containerID="7dba2d07908548962f40435efa50aed2a21f68c9f55a50ad39cc396d718c6cf2" exitCode=0 Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.740011 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-npl52" event={"ID":"db342a3d-55f5-4b0c-b96f-327014b6fb82","Type":"ContainerDied","Data":"7dba2d07908548962f40435efa50aed2a21f68c9f55a50ad39cc396d718c6cf2"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.747260 4836 generic.go:334] "Generic (PLEG): container finished" podID="88b1aa3a-dc15-4ec1-ba76-8246e300422f" containerID="e2428efba069899bf573bcb1f933d6f640083a8f0e4830cd36751b8b3332488d" exitCode=0 Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 
14:29:42.747423 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8wrd" event={"ID":"88b1aa3a-dc15-4ec1-ba76-8246e300422f","Type":"ContainerDied","Data":"e2428efba069899bf573bcb1f933d6f640083a8f0e4830cd36751b8b3332488d"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.773576 4836 generic.go:334] "Generic (PLEG): container finished" podID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerID="2d37a99072f4fb6a9bc38dee8c6986d96ef5977cd1d2c3dca6d3d95cb5f3bcee" exitCode=143 Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.773653 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fc032cb-3063-4e39-a91f-ccc89defe9c4","Type":"ContainerDied","Data":"2d37a99072f4fb6a9bc38dee8c6986d96ef5977cd1d2c3dca6d3d95cb5f3bcee"} Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.796449 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-5h5m9" podStartSLOduration=4.796421274 podStartE2EDuration="4.796421274s" podCreationTimestamp="2026-02-17 14:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:42.75921038 +0000 UTC m=+1409.102138639" watchObservedRunningTime="2026-02-17 14:29:42.796421274 +0000 UTC m=+1409.139349543" Feb 17 14:29:42 crc kubenswrapper[4836]: I0217 14:29:42.820750 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-28f5-account-create-update-74tvm" podStartSLOduration=4.820720211 podStartE2EDuration="4.820720211s" podCreationTimestamp="2026-02-17 14:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:42.805384497 +0000 UTC m=+1409.148312786" watchObservedRunningTime="2026-02-17 14:29:42.820720211 +0000 UTC m=+1409.163648490" Feb 17 14:29:43 
crc kubenswrapper[4836]: I0217 14:29:43.769080 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.776164 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.791860 4836 generic.go:334] "Generic (PLEG): container finished" podID="4dc00367-2940-413d-872a-74d4fa37fc1f" containerID="b40337010298624b5f124e89e37fbded22f8ac5a672bad50ecf9c49dfa1ed535" exitCode=0 Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.792002 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-28f5-account-create-update-74tvm" event={"ID":"4dc00367-2940-413d-872a-74d4fa37fc1f","Type":"ContainerDied","Data":"b40337010298624b5f124e89e37fbded22f8ac5a672bad50ecf9c49dfa1ed535"} Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.795661 4836 generic.go:334] "Generic (PLEG): container finished" podID="c7d61f8c-4804-49b6-937e-fbaf20aa3ed2" containerID="940b27e8f09ea23f3f385f55c83e9233f241038d9dc1c8761036c1c3dbf2e000" exitCode=0 Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.795848 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a7c4-account-create-update-qj5lb" event={"ID":"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2","Type":"ContainerDied","Data":"940b27e8f09ea23f3f385f55c83e9233f241038d9dc1c8761036c1c3dbf2e000"} Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.800165 4836 generic.go:334] "Generic (PLEG): container finished" podID="0312359b-98a6-49c7-83f1-fb44c679e8aa" containerID="66b9158b23020b3eaa0a3cea1af11df9fcdac6316e74751284cbec084e23c3a0" exitCode=0 Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.800267 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5h5m9" 
event={"ID":"0312359b-98a6-49c7-83f1-fb44c679e8aa","Type":"ContainerDied","Data":"66b9158b23020b3eaa0a3cea1af11df9fcdac6316e74751284cbec084e23c3a0"} Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.803579 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8wrd" event={"ID":"88b1aa3a-dc15-4ec1-ba76-8246e300422f","Type":"ContainerDied","Data":"bb26936403352ce5d4b38858c684c231085b015394ef1a491b1db62d38cc94f0"} Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.803610 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb26936403352ce5d4b38858c684c231085b015394ef1a491b1db62d38cc94f0" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.803704 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q8wrd" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.811644 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c29f84b9-3879-4fc6-b2aa-e334bd08f24e","Type":"ContainerDied","Data":"a73e6cf975755957f05fddc903522d5d75b3eb7f41eb5a42c5ad06b115f44634"} Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.811727 4836 scope.go:117] "RemoveContainer" containerID="ffac93583d3a46218a79cd0eec11b0e9213bdce6e0622ee8ec1b1030a56cebbf" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.811963 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.818686 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerStarted","Data":"dfd206b5463d3cc6f1e9888d9e21e49bdbfd5f95e1989f13c38e03bb00682c21"} Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.838836 4836 generic.go:334] "Generic (PLEG): container finished" podID="0b8171da-ad25-4388-9dab-2afc19993d97" containerID="a870dbadddedc2cd296e8c04a81b16817f6df39787b8061ee58f3dfc1fec3ca8" exitCode=0 Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.839135 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" event={"ID":"0b8171da-ad25-4388-9dab-2afc19993d97","Type":"ContainerDied","Data":"a870dbadddedc2cd296e8c04a81b16817f6df39787b8061ee58f3dfc1fec3ca8"} Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856499 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856611 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2z4m\" (UniqueName: \"kubernetes.io/projected/88b1aa3a-dc15-4ec1-ba76-8246e300422f-kube-api-access-d2z4m\") pod \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856641 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-httpd-run\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: 
\"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856689 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-logs\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856713 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-scripts\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856737 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-combined-ca-bundle\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856772 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cbj8\" (UniqueName: \"kubernetes.io/projected/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-kube-api-access-8cbj8\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856813 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88b1aa3a-dc15-4ec1-ba76-8246e300422f-operator-scripts\") pod \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\" (UID: \"88b1aa3a-dc15-4ec1-ba76-8246e300422f\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856861 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-config-data\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.856950 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-public-tls-certs\") pod \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\" (UID: \"c29f84b9-3879-4fc6-b2aa-e334bd08f24e\") " Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.869975 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88b1aa3a-dc15-4ec1-ba76-8246e300422f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88b1aa3a-dc15-4ec1-ba76-8246e300422f" (UID: "88b1aa3a-dc15-4ec1-ba76-8246e300422f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.874808 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.871874 4836 scope.go:117] "RemoveContainer" containerID="4231e0f0134e5c8db2d1379ad611e9d1ddd911c706b7c534c46f5a480fa7035b" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.882742 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-logs" (OuterVolumeSpecName: "logs") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.915864 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-kube-api-access-8cbj8" (OuterVolumeSpecName: "kube-api-access-8cbj8") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "kube-api-access-8cbj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.915990 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b1aa3a-dc15-4ec1-ba76-8246e300422f-kube-api-access-d2z4m" (OuterVolumeSpecName: "kube-api-access-d2z4m") pod "88b1aa3a-dc15-4ec1-ba76-8246e300422f" (UID: "88b1aa3a-dc15-4ec1-ba76-8246e300422f"). InnerVolumeSpecName "kube-api-access-d2z4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.931488 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-scripts" (OuterVolumeSpecName: "scripts") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.943862 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.961428 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2z4m\" (UniqueName: \"kubernetes.io/projected/88b1aa3a-dc15-4ec1-ba76-8246e300422f-kube-api-access-d2z4m\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.961478 4836 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.961498 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.961511 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.961521 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.961531 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cbj8\" (UniqueName: \"kubernetes.io/projected/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-kube-api-access-8cbj8\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:43 crc kubenswrapper[4836]: I0217 14:29:43.961548 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88b1aa3a-dc15-4ec1-ba76-8246e300422f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.070386 4836 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.073040 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34" (OuterVolumeSpecName: "glance") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.134816 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-config-data" (OuterVolumeSpecName: "config-data") pod "c29f84b9-3879-4fc6-b2aa-e334bd08f24e" (UID: "c29f84b9-3879-4fc6-b2aa-e334bd08f24e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.154552 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.198106 4836 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") on node \"crc\" " Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.198152 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.198169 4836 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c29f84b9-3879-4fc6-b2aa-e334bd08f24e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.247887 4836 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.248063 4836 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34") on node "crc" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.298870 4836 reconciler_common.go:293] "Volume detached for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.460152 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.983166 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db342a3d-55f5-4b0c-b96f-327014b6fb82-operator-scripts\") pod \"db342a3d-55f5-4b0c-b96f-327014b6fb82\" (UID: \"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.983273 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hknvs\" (UniqueName: \"kubernetes.io/projected/db342a3d-55f5-4b0c-b96f-327014b6fb82-kube-api-access-hknvs\") pod \"db342a3d-55f5-4b0c-b96f-327014b6fb82\" (UID: \"db342a3d-55f5-4b0c-b96f-327014b6fb82\") " Feb 17 14:29:44 crc kubenswrapper[4836]: I0217 14:29:44.995238 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db342a3d-55f5-4b0c-b96f-327014b6fb82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db342a3d-55f5-4b0c-b96f-327014b6fb82" (UID: "db342a3d-55f5-4b0c-b96f-327014b6fb82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.044940 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db342a3d-55f5-4b0c-b96f-327014b6fb82-kube-api-access-hknvs" (OuterVolumeSpecName: "kube-api-access-hknvs") pod "db342a3d-55f5-4b0c-b96f-327014b6fb82" (UID: "db342a3d-55f5-4b0c-b96f-327014b6fb82"). InnerVolumeSpecName "kube-api-access-hknvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.084833 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.086000 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db342a3d-55f5-4b0c-b96f-327014b6fb82-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.086049 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hknvs\" (UniqueName: \"kubernetes.io/projected/db342a3d-55f5-4b0c-b96f-327014b6fb82-kube-api-access-hknvs\") on node \"crc\" DevicePath \"\"" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.135363 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-npl52" event={"ID":"db342a3d-55f5-4b0c-b96f-327014b6fb82","Type":"ContainerDied","Data":"582663419ea06870f82c61e67b714be6e79694fd2b49d90ddf21ffdb14cf9940"} Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.135415 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="582663419ea06870f82c61e67b714be6e79694fd2b49d90ddf21ffdb14cf9940" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.135529 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-npl52" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.162545 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.293039 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:45 crc kubenswrapper[4836]: E0217 14:29:45.294229 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-log" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294261 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-log" Feb 17 14:29:45 crc kubenswrapper[4836]: E0217 14:29:45.294308 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b1aa3a-dc15-4ec1-ba76-8246e300422f" containerName="mariadb-database-create" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294318 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b1aa3a-dc15-4ec1-ba76-8246e300422f" containerName="mariadb-database-create" Feb 17 14:29:45 crc kubenswrapper[4836]: E0217 14:29:45.294355 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-httpd" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294365 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-httpd" Feb 17 14:29:45 crc kubenswrapper[4836]: E0217 14:29:45.294401 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db342a3d-55f5-4b0c-b96f-327014b6fb82" containerName="mariadb-database-create" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294410 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="db342a3d-55f5-4b0c-b96f-327014b6fb82" containerName="mariadb-database-create" 
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294683 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-log" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294718 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="db342a3d-55f5-4b0c-b96f-327014b6fb82" containerName="mariadb-database-create" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294740 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" containerName="glance-httpd" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.294755 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b1aa3a-dc15-4ec1-ba76-8246e300422f" containerName="mariadb-database-create" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.302462 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.316641 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.317023 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.374383 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 14:29:45 crc kubenswrapper[4836]: E0217 14:29:45.438016 4836 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb342a3d_55f5_4b0c_b96f_327014b6fb82.slice\": RecentStats: unable to find data in memory cache]" Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.512389 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.516420 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.516474 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcl4r\" (UniqueName: \"kubernetes.io/projected/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-kube-api-access-wcl4r\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.516605 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.516628 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.516725 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.516772 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-logs\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.517072 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640195 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640246 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640353 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640403 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-logs\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640645 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640848 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640895 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.640922 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcl4r\" (UniqueName: \"kubernetes.io/projected/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-kube-api-access-wcl4r\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.646409 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-logs\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.646761 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.662906 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-config-data\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.670793 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-scripts\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.671544 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.677076 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcl4r\" (UniqueName: \"kubernetes.io/projected/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-kube-api-access-wcl4r\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.682250 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5121f0d-e93f-44c6-96b5-4ed7b6ec960e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.736842 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.737141 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f1c05c143b5a67726d067625f4c5da25dac4624853da03b1088e3ef561519b77/globalmount\"" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:45 crc kubenswrapper[4836]: I0217 14:29:45.826555 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-47a47ba0-1e72-4c37-bb61-5251f8f68b34\") pod \"glance-default-external-api-0\" (UID: \"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e\") " pod="openstack/glance-default-external-api-0"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.168069 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.266910 4836 generic.go:334] "Generic (PLEG): container finished" podID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerID="253884c8bfee6f38dc03fef1da6c5e47b92d31a3b1592567360ef3f04d7144a9" exitCode=0
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.273679 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fc032cb-3063-4e39-a91f-ccc89defe9c4","Type":"ContainerDied","Data":"253884c8bfee6f38dc03fef1da6c5e47b92d31a3b1592567360ef3f04d7144a9"}
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.503013 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8fba-account-create-update-gqd5n"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.625611 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29f84b9-3879-4fc6-b2aa-e334bd08f24e" path="/var/lib/kubelet/pods/c29f84b9-3879-4fc6-b2aa-e334bd08f24e/volumes"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.643360 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5h5m9"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.666057 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a7c4-account-create-update-qj5lb"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.688782 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5d87f46c5f-vfn9f"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.689014 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-28f5-account-create-update-74tvm"
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.690408 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bk6l\" (UniqueName: \"kubernetes.io/projected/0b8171da-ad25-4388-9dab-2afc19993d97-kube-api-access-9bk6l\") pod \"0b8171da-ad25-4388-9dab-2afc19993d97\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.690606 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8171da-ad25-4388-9dab-2afc19993d97-operator-scripts\") pod \"0b8171da-ad25-4388-9dab-2afc19993d97\" (UID: \"0b8171da-ad25-4388-9dab-2afc19993d97\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.691809 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b8171da-ad25-4388-9dab-2afc19993d97-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b8171da-ad25-4388-9dab-2afc19993d97" (UID: "0b8171da-ad25-4388-9dab-2afc19993d97"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.714088 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b8171da-ad25-4388-9dab-2afc19993d97-kube-api-access-9bk6l" (OuterVolumeSpecName: "kube-api-access-9bk6l") pod "0b8171da-ad25-4388-9dab-2afc19993d97" (UID: "0b8171da-ad25-4388-9dab-2afc19993d97"). InnerVolumeSpecName "kube-api-access-9bk6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.794471 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2xjk\" (UniqueName: \"kubernetes.io/projected/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-kube-api-access-f2xjk\") pod \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\" (UID: \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.794707 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0312359b-98a6-49c7-83f1-fb44c679e8aa-operator-scripts\") pod \"0312359b-98a6-49c7-83f1-fb44c679e8aa\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.794750 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-operator-scripts\") pod \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\" (UID: \"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.794828 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmsfl\" (UniqueName: \"kubernetes.io/projected/0312359b-98a6-49c7-83f1-fb44c679e8aa-kube-api-access-nmsfl\") pod \"0312359b-98a6-49c7-83f1-fb44c679e8aa\" (UID: \"0312359b-98a6-49c7-83f1-fb44c679e8aa\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.794883 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dc00367-2940-413d-872a-74d4fa37fc1f-operator-scripts\") pod \"4dc00367-2940-413d-872a-74d4fa37fc1f\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.794902 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbccw\" (UniqueName: \"kubernetes.io/projected/4dc00367-2940-413d-872a-74d4fa37fc1f-kube-api-access-dbccw\") pod \"4dc00367-2940-413d-872a-74d4fa37fc1f\" (UID: \"4dc00367-2940-413d-872a-74d4fa37fc1f\") "
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.795966 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bk6l\" (UniqueName: \"kubernetes.io/projected/0b8171da-ad25-4388-9dab-2afc19993d97-kube-api-access-9bk6l\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.795982 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b8171da-ad25-4388-9dab-2afc19993d97-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.798031 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7d61f8c-4804-49b6-937e-fbaf20aa3ed2" (UID: "c7d61f8c-4804-49b6-937e-fbaf20aa3ed2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.798407 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0312359b-98a6-49c7-83f1-fb44c679e8aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0312359b-98a6-49c7-83f1-fb44c679e8aa" (UID: "0312359b-98a6-49c7-83f1-fb44c679e8aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.799438 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc00367-2940-413d-872a-74d4fa37fc1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4dc00367-2940-413d-872a-74d4fa37fc1f" (UID: "4dc00367-2940-413d-872a-74d4fa37fc1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.807676 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0312359b-98a6-49c7-83f1-fb44c679e8aa-kube-api-access-nmsfl" (OuterVolumeSpecName: "kube-api-access-nmsfl") pod "0312359b-98a6-49c7-83f1-fb44c679e8aa" (UID: "0312359b-98a6-49c7-83f1-fb44c679e8aa"). InnerVolumeSpecName "kube-api-access-nmsfl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.807714 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc00367-2940-413d-872a-74d4fa37fc1f-kube-api-access-dbccw" (OuterVolumeSpecName: "kube-api-access-dbccw") pod "4dc00367-2940-413d-872a-74d4fa37fc1f" (UID: "4dc00367-2940-413d-872a-74d4fa37fc1f"). InnerVolumeSpecName "kube-api-access-dbccw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.808197 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-kube-api-access-f2xjk" (OuterVolumeSpecName: "kube-api-access-f2xjk") pod "c7d61f8c-4804-49b6-937e-fbaf20aa3ed2" (UID: "c7d61f8c-4804-49b6-937e-fbaf20aa3ed2"). InnerVolumeSpecName "kube-api-access-f2xjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.901443 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0312359b-98a6-49c7-83f1-fb44c679e8aa-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.901478 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.901487 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmsfl\" (UniqueName: \"kubernetes.io/projected/0312359b-98a6-49c7-83f1-fb44c679e8aa-kube-api-access-nmsfl\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.901500 4836 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4dc00367-2940-413d-872a-74d4fa37fc1f-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.901508 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbccw\" (UniqueName: \"kubernetes.io/projected/4dc00367-2940-413d-872a-74d4fa37fc1f-kube-api-access-dbccw\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.901516 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2xjk\" (UniqueName: \"kubernetes.io/projected/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2-kube-api-access-f2xjk\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:46 crc kubenswrapper[4836]: I0217 14:29:46.995774 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110223 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-internal-tls-certs\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110364 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-combined-ca-bundle\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110405 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-scripts\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110451 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-config-data\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110496 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-logs\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110550 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-httpd-run\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110716 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.110775 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6smd\" (UniqueName: \"kubernetes.io/projected/9fc032cb-3063-4e39-a91f-ccc89defe9c4-kube-api-access-x6smd\") pod \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\" (UID: \"9fc032cb-3063-4e39-a91f-ccc89defe9c4\") "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.113799 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-logs" (OuterVolumeSpecName: "logs") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.113996 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.126552 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fc032cb-3063-4e39-a91f-ccc89defe9c4-kube-api-access-x6smd" (OuterVolumeSpecName: "kube-api-access-x6smd") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "kube-api-access-x6smd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.396966 4836 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.398155 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6smd\" (UniqueName: \"kubernetes.io/projected/9fc032cb-3063-4e39-a91f-ccc89defe9c4-kube-api-access-x6smd\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.398173 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fc032cb-3063-4e39-a91f-ccc89defe9c4-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.398329 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-scripts" (OuterVolumeSpecName: "scripts") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.430488 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.445667 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf" (OuterVolumeSpecName: "glance") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.469706 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9fc032cb-3063-4e39-a91f-ccc89defe9c4","Type":"ContainerDied","Data":"38f3541a8bef919fb1afd541589fd4540ccef699d3e6a2e7f1dcb0859f09ea45"}
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.469767 4836 scope.go:117] "RemoveContainer" containerID="253884c8bfee6f38dc03fef1da6c5e47b92d31a3b1592567360ef3f04d7144a9"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.470007 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.473788 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.482130 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8fba-account-create-update-gqd5n" event={"ID":"0b8171da-ad25-4388-9dab-2afc19993d97","Type":"ContainerDied","Data":"4978da281b4ffb4cbde1dc06e973f40c67a116248d8a8623898e48ea004f575f"}
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.482171 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4978da281b4ffb4cbde1dc06e973f40c67a116248d8a8623898e48ea004f575f"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.482315 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8fba-account-create-update-gqd5n"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.497772 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-28f5-account-create-update-74tvm" event={"ID":"4dc00367-2940-413d-872a-74d4fa37fc1f","Type":"ContainerDied","Data":"684fea3361f7992d7677d58a81ef405045d31b021d431929ec2a4e0d9ce8e5bf"}
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.497838 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="684fea3361f7992d7677d58a81ef405045d31b021d431929ec2a4e0d9ce8e5bf"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.498001 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-28f5-account-create-update-74tvm"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.504871 4836 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") on node \"crc\" "
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.504919 4836 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.504933 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.504946 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.510230 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a7c4-account-create-update-qj5lb" event={"ID":"c7d61f8c-4804-49b6-937e-fbaf20aa3ed2","Type":"ContainerDied","Data":"a8bfacaf56b208729fed6ae7379213c44a5bf9bbc00aaa497d58947acfd5fda8"}
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.510277 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8bfacaf56b208729fed6ae7379213c44a5bf9bbc00aaa497d58947acfd5fda8"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.510373 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a7c4-account-create-update-qj5lb"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.513824 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5h5m9" event={"ID":"0312359b-98a6-49c7-83f1-fb44c679e8aa","Type":"ContainerDied","Data":"8aebd8b0cf09f0b5a71ad7edb46a57f5a3212f3d6f8147621e038f3b2d4a75eb"}
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.513873 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aebd8b0cf09f0b5a71ad7edb46a57f5a3212f3d6f8147621e038f3b2d4a75eb"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.513966 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5h5m9"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.515002 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-config-data" (OuterVolumeSpecName: "config-data") pod "9fc032cb-3063-4e39-a91f-ccc89defe9c4" (UID: "9fc032cb-3063-4e39-a91f-ccc89defe9c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.528511 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerStarted","Data":"e9f32ae3116b957a3eb3e85cc5cb945cd5cac421baa8b9f3e186e90cea341d86"}
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.612037 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fc032cb-3063-4e39-a91f-ccc89defe9c4-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.666232 4836 scope.go:117] "RemoveContainer" containerID="2d37a99072f4fb6a9bc38dee8c6986d96ef5977cd1d2c3dca6d3d95cb5f3bcee"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.676142 4836 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.681567 4836 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf") on node "crc"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.726791 4836 reconciler_common.go:293] "Volume detached for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.783211 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.924433 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.953979 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.994734 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 14:29:47 crc kubenswrapper[4836]: E0217 14:29:47.995853 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d61f8c-4804-49b6-937e-fbaf20aa3ed2" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.995963 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d61f8c-4804-49b6-937e-fbaf20aa3ed2" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: E0217 14:29:47.996124 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-log"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.996187 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-log"
Feb 17 14:29:47 crc kubenswrapper[4836]: E0217 14:29:47.996268 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0312359b-98a6-49c7-83f1-fb44c679e8aa" containerName="mariadb-database-create"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.996361 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="0312359b-98a6-49c7-83f1-fb44c679e8aa" containerName="mariadb-database-create"
Feb 17 14:29:47 crc kubenswrapper[4836]: E0217 14:29:47.996477 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b8171da-ad25-4388-9dab-2afc19993d97" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.996547 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b8171da-ad25-4388-9dab-2afc19993d97" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: E0217 14:29:47.996614 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc00367-2940-413d-872a-74d4fa37fc1f" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.996671 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc00367-2940-413d-872a-74d4fa37fc1f" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: E0217 14:29:47.996728 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-httpd"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.996808 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-httpd"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.997172 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-log"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.997246 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="0312359b-98a6-49c7-83f1-fb44c679e8aa" containerName="mariadb-database-create"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.997380 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d61f8c-4804-49b6-937e-fbaf20aa3ed2" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.997447 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc00367-2940-413d-872a-74d4fa37fc1f" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.997516 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" containerName="glance-httpd"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.997589 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b8171da-ad25-4388-9dab-2afc19993d97" containerName="mariadb-account-create-update"
Feb 17 14:29:47 crc kubenswrapper[4836]: I0217 14:29:47.999531 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.002629 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.003170 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.010532 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.148256 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/172fadf8-99d3-436a-b711-010e8ffe289b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.148754 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.148811 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.148850 4836
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.148884 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.149037 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/172fadf8-99d3-436a-b711-010e8ffe289b-logs\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.149110 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.149344 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d984t\" (UniqueName: \"kubernetes.io/projected/172fadf8-99d3-436a-b711-010e8ffe289b-kube-api-access-d984t\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.257925 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d984t\" (UniqueName: \"kubernetes.io/projected/172fadf8-99d3-436a-b711-010e8ffe289b-kube-api-access-d984t\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.258017 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/172fadf8-99d3-436a-b711-010e8ffe289b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.258061 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.258094 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.258125 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" 
Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.258152 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.258214 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/172fadf8-99d3-436a-b711-010e8ffe289b-logs\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.258233 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.259514 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/172fadf8-99d3-436a-b711-010e8ffe289b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.261835 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/172fadf8-99d3-436a-b711-010e8ffe289b-logs\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.262580 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.264508 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.266571 4836 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.266596 4836 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20e9fd566d593755c515c6f55c386051b7cebe94721b27d85313d87ab22fcec4/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.267487 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.269028 4836 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/172fadf8-99d3-436a-b711-010e8ffe289b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.280002 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d984t\" (UniqueName: \"kubernetes.io/projected/172fadf8-99d3-436a-b711-010e8ffe289b-kube-api-access-d984t\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.333368 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21ce1609-dbd7-400f-b0f4-c62fbe057ccf\") pod \"glance-default-internal-api-0\" (UID: \"172fadf8-99d3-436a-b711-010e8ffe289b\") " pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.371670 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 14:29:48 crc kubenswrapper[4836]: I0217 14:29:48.960939 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fc032cb-3063-4e39-a91f-ccc89defe9c4" path="/var/lib/kubelet/pods/9fc032cb-3063-4e39-a91f-ccc89defe9c4/volumes" Feb 17 14:29:49 crc kubenswrapper[4836]: I0217 14:29:49.004310 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e","Type":"ContainerStarted","Data":"eee4041fb1838b993b2e57f0d04d074f1c54f5467ef071b6911929052f11a3ae"} Feb 17 14:29:49 crc kubenswrapper[4836]: I0217 14:29:49.578739 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-896gw"] Feb 17 14:29:49 crc kubenswrapper[4836]: I0217 14:29:49.595401 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:49 crc kubenswrapper[4836]: I0217 14:29:49.608885 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 17 14:29:49 crc kubenswrapper[4836]: I0217 14:29:49.609687 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-4kxcj" Feb 17 14:29:49 crc kubenswrapper[4836]: I0217 14:29:49.609835 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.045815 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-896gw"] Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.109965 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-scripts\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: 
\"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.110192 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-config-data\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.110742 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.110853 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lng9\" (UniqueName: \"kubernetes.io/projected/5284ac65-3629-4b0f-94ce-114964fe6d15-kube-api-access-9lng9\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.119829 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerStarted","Data":"af4d958a142f654427a1232ba635917cf28fa5a94e024ffbc7ea40f1fda64c7a"} Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.120244 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-central-agent" 
containerID="cri-o://d3bbefd170a172c21cb9f9e3cfad807a2c0bb5fe5338d9d264fb6ae4c6ff5de7" gracePeriod=30 Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.125763 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="proxy-httpd" containerID="cri-o://af4d958a142f654427a1232ba635917cf28fa5a94e024ffbc7ea40f1fda64c7a" gracePeriod=30 Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.125837 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="sg-core" containerID="cri-o://e9f32ae3116b957a3eb3e85cc5cb945cd5cac421baa8b9f3e186e90cea341d86" gracePeriod=30 Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.125893 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-notification-agent" containerID="cri-o://dfd206b5463d3cc6f1e9888d9e21e49bdbfd5f95e1989f13c38e03bb00682c21" gracePeriod=30 Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.120819 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.174504 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.170128532 podStartE2EDuration="13.174475318s" podCreationTimestamp="2026-02-17 14:29:37 +0000 UTC" firstStartedPulling="2026-02-17 14:29:39.966639217 +0000 UTC m=+1406.309567486" lastFinishedPulling="2026-02-17 14:29:47.970986003 +0000 UTC m=+1414.313914272" observedRunningTime="2026-02-17 14:29:50.168922818 +0000 UTC m=+1416.511851087" watchObservedRunningTime="2026-02-17 14:29:50.174475318 +0000 UTC m=+1416.517403587" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.213179 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.213625 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lng9\" (UniqueName: \"kubernetes.io/projected/5284ac65-3629-4b0f-94ce-114964fe6d15-kube-api-access-9lng9\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.213669 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-scripts\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.213703 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-config-data\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.244462 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-config-data\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.245310 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-scripts\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.248755 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lng9\" (UniqueName: \"kubernetes.io/projected/5284ac65-3629-4b0f-94ce-114964fe6d15-kube-api-access-9lng9\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.262663 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-896gw\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:50 crc kubenswrapper[4836]: W0217 14:29:50.299197 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod172fadf8_99d3_436a_b711_010e8ffe289b.slice/crio-179c39149c981ba659609d3f03303df1c786c03bd40758053a07f8ad2935bd68 WatchSource:0}: Error finding container 179c39149c981ba659609d3f03303df1c786c03bd40758053a07f8ad2935bd68: Status 404 returned error can't find the container with id 179c39149c981ba659609d3f03303df1c786c03bd40758053a07f8ad2935bd68 Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.299333 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 14:29:50 crc kubenswrapper[4836]: I0217 14:29:50.332714 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:29:51 crc kubenswrapper[4836]: I0217 14:29:51.025692 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-896gw"] Feb 17 14:29:51 crc kubenswrapper[4836]: W0217 14:29:51.037511 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5284ac65_3629_4b0f_94ce_114964fe6d15.slice/crio-9e6101c106f2eea30b241c7cb1b7e2d51a47ac641af556866c4a4cf6f00c0aad WatchSource:0}: Error finding container 9e6101c106f2eea30b241c7cb1b7e2d51a47ac641af556866c4a4cf6f00c0aad: Status 404 returned error can't find the container with id 9e6101c106f2eea30b241c7cb1b7e2d51a47ac641af556866c4a4cf6f00c0aad Feb 17 14:29:51 crc kubenswrapper[4836]: I0217 14:29:51.144661 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"172fadf8-99d3-436a-b711-010e8ffe289b","Type":"ContainerStarted","Data":"179c39149c981ba659609d3f03303df1c786c03bd40758053a07f8ad2935bd68"} Feb 17 14:29:51 crc kubenswrapper[4836]: I0217 14:29:51.153168 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-896gw" event={"ID":"5284ac65-3629-4b0f-94ce-114964fe6d15","Type":"ContainerStarted","Data":"9e6101c106f2eea30b241c7cb1b7e2d51a47ac641af556866c4a4cf6f00c0aad"} Feb 17 14:29:51 crc kubenswrapper[4836]: I0217 14:29:51.158091 4836 generic.go:334] "Generic (PLEG): container finished" podID="56c9a452-ffd5-4b03-97a9-93546a194414" containerID="af4d958a142f654427a1232ba635917cf28fa5a94e024ffbc7ea40f1fda64c7a" exitCode=0 Feb 17 14:29:51 crc kubenswrapper[4836]: I0217 14:29:51.158147 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerDied","Data":"af4d958a142f654427a1232ba635917cf28fa5a94e024ffbc7ea40f1fda64c7a"} Feb 17 14:29:52 crc 
kubenswrapper[4836]: I0217 14:29:52.205612 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"172fadf8-99d3-436a-b711-010e8ffe289b","Type":"ContainerStarted","Data":"16df76dbfff6e228f61eb7dd36f51ac2cc1e26c0fbe526656d756c8cd2c0e93e"} Feb 17 14:29:52 crc kubenswrapper[4836]: I0217 14:29:52.209358 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e","Type":"ContainerStarted","Data":"cfc49db59f8cb7aada23feb5e94f69f569dcdd81eb5540b251670502235191b8"} Feb 17 14:29:52 crc kubenswrapper[4836]: I0217 14:29:52.212245 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4fe674a8-c32b-412e-8d20-2a6e7e18bb10","Type":"ContainerStarted","Data":"46a4010331e50fdf7e24756f54d4faba9466e11dc0a570feeed789e0e0fe6807"} Feb 17 14:29:52 crc kubenswrapper[4836]: I0217 14:29:52.236120 4836 generic.go:334] "Generic (PLEG): container finished" podID="56c9a452-ffd5-4b03-97a9-93546a194414" containerID="e9f32ae3116b957a3eb3e85cc5cb945cd5cac421baa8b9f3e186e90cea341d86" exitCode=2 Feb 17 14:29:52 crc kubenswrapper[4836]: I0217 14:29:52.236164 4836 generic.go:334] "Generic (PLEG): container finished" podID="56c9a452-ffd5-4b03-97a9-93546a194414" containerID="dfd206b5463d3cc6f1e9888d9e21e49bdbfd5f95e1989f13c38e03bb00682c21" exitCode=0 Feb 17 14:29:52 crc kubenswrapper[4836]: I0217 14:29:52.236188 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerDied","Data":"e9f32ae3116b957a3eb3e85cc5cb945cd5cac421baa8b9f3e186e90cea341d86"} Feb 17 14:29:52 crc kubenswrapper[4836]: I0217 14:29:52.236225 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerDied","Data":"dfd206b5463d3cc6f1e9888d9e21e49bdbfd5f95e1989f13c38e03bb00682c21"} Feb 17 14:29:52 crc kubenswrapper[4836]: I0217 14:29:52.251681 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=6.811449014 podStartE2EDuration="44.251651241s" podCreationTimestamp="2026-02-17 14:29:08 +0000 UTC" firstStartedPulling="2026-02-17 14:29:10.694855327 +0000 UTC m=+1377.037783596" lastFinishedPulling="2026-02-17 14:29:48.135057554 +0000 UTC m=+1414.477985823" observedRunningTime="2026-02-17 14:29:52.239562156 +0000 UTC m=+1418.582490425" watchObservedRunningTime="2026-02-17 14:29:52.251651241 +0000 UTC m=+1418.594579530" Feb 17 14:29:53 crc kubenswrapper[4836]: I0217 14:29:53.257841 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"172fadf8-99d3-436a-b711-010e8ffe289b","Type":"ContainerStarted","Data":"f21a427ef140473387eb828643e5c1c1f5df7ae54ee3624be131f330b8f47e43"} Feb 17 14:29:53 crc kubenswrapper[4836]: I0217 14:29:53.266975 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b5121f0d-e93f-44c6-96b5-4ed7b6ec960e","Type":"ContainerStarted","Data":"e5f0909bf1bdf3c38ab94d209afa4c703fec8864b01634ec0cb8a2070cb29a63"} Feb 17 14:29:53 crc kubenswrapper[4836]: I0217 14:29:53.289551 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.28953086 podStartE2EDuration="6.28953086s" podCreationTimestamp="2026-02-17 14:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:53.286186729 +0000 UTC m=+1419.629114998" watchObservedRunningTime="2026-02-17 14:29:53.28953086 +0000 UTC m=+1419.632459129" Feb 17 14:29:53 crc 
kubenswrapper[4836]: I0217 14:29:53.326583 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.326531669 podStartE2EDuration="8.326531669s" podCreationTimestamp="2026-02-17 14:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:29:53.321417591 +0000 UTC m=+1419.664345860" watchObservedRunningTime="2026-02-17 14:29:53.326531669 +0000 UTC m=+1419.669459938"
Feb 17 14:29:56 crc kubenswrapper[4836]: I0217 14:29:56.403424 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:56 crc kubenswrapper[4836]: I0217 14:29:56.405081 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:56 crc kubenswrapper[4836]: I0217 14:29:56.512812 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:56 crc kubenswrapper[4836]: I0217 14:29:56.533197 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:56 crc kubenswrapper[4836]: I0217 14:29:56.864536 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-api-0" podUID="260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.194:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:29:56 crc kubenswrapper[4836]: I0217 14:29:56.892509 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cloudkitty-api-0" podUID="260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49" containerName="cloudkitty-api" probeResult="failure" output="Get \"https://10.217.0.194:8889/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:29:57 crc kubenswrapper[4836]: I0217 14:29:57.491909 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:57 crc kubenswrapper[4836]: I0217 14:29:57.491951 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 17 14:29:58 crc kubenswrapper[4836]: I0217 14:29:58.874486 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:58 crc kubenswrapper[4836]: I0217 14:29:58.930021 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:58 crc kubenswrapper[4836]: I0217 14:29:58.954615 4836 generic.go:334] "Generic (PLEG): container finished" podID="56c9a452-ffd5-4b03-97a9-93546a194414" containerID="d3bbefd170a172c21cb9f9e3cfad807a2c0bb5fe5338d9d264fb6ae4c6ff5de7" exitCode=0
Feb 17 14:29:58 crc kubenswrapper[4836]: I0217 14:29:58.955799 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerDied","Data":"d3bbefd170a172c21cb9f9e3cfad807a2c0bb5fe5338d9d264fb6ae4c6ff5de7"}
Feb 17 14:29:58 crc kubenswrapper[4836]: I0217 14:29:58.972923 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.013193 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.485944 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.630847 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-combined-ca-bundle\") pod \"56c9a452-ffd5-4b03-97a9-93546a194414\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") "
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.631588 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-config-data\") pod \"56c9a452-ffd5-4b03-97a9-93546a194414\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") "
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.631681 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-scripts\") pod \"56c9a452-ffd5-4b03-97a9-93546a194414\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") "
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.631739 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-log-httpd\") pod \"56c9a452-ffd5-4b03-97a9-93546a194414\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") "
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.631889 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq7ns\" (UniqueName: \"kubernetes.io/projected/56c9a452-ffd5-4b03-97a9-93546a194414-kube-api-access-pq7ns\") pod \"56c9a452-ffd5-4b03-97a9-93546a194414\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") "
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.631952 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-sg-core-conf-yaml\") pod \"56c9a452-ffd5-4b03-97a9-93546a194414\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") "
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.631979 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-run-httpd\") pod \"56c9a452-ffd5-4b03-97a9-93546a194414\" (UID: \"56c9a452-ffd5-4b03-97a9-93546a194414\") "
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.632693 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "56c9a452-ffd5-4b03-97a9-93546a194414" (UID: "56c9a452-ffd5-4b03-97a9-93546a194414"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.632901 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.633122 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "56c9a452-ffd5-4b03-97a9-93546a194414" (UID: "56c9a452-ffd5-4b03-97a9-93546a194414"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.640019 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-scripts" (OuterVolumeSpecName: "scripts") pod "56c9a452-ffd5-4b03-97a9-93546a194414" (UID: "56c9a452-ffd5-4b03-97a9-93546a194414"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.652476 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c9a452-ffd5-4b03-97a9-93546a194414-kube-api-access-pq7ns" (OuterVolumeSpecName: "kube-api-access-pq7ns") pod "56c9a452-ffd5-4b03-97a9-93546a194414" (UID: "56c9a452-ffd5-4b03-97a9-93546a194414"). InnerVolumeSpecName "kube-api-access-pq7ns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.714453 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "56c9a452-ffd5-4b03-97a9-93546a194414" (UID: "56c9a452-ffd5-4b03-97a9-93546a194414"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.737133 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq7ns\" (UniqueName: \"kubernetes.io/projected/56c9a452-ffd5-4b03-97a9-93546a194414-kube-api-access-pq7ns\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.737184 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.737198 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/56c9a452-ffd5-4b03-97a9-93546a194414-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.737208 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.765089 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.765364 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.813683 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c9a452-ffd5-4b03-97a9-93546a194414" (UID: "56c9a452-ffd5-4b03-97a9-93546a194414"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.825131 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-config-data" (OuterVolumeSpecName: "config-data") pod "56c9a452-ffd5-4b03-97a9-93546a194414" (UID: "56c9a452-ffd5-4b03-97a9-93546a194414"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.839831 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.839892 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c9a452-ffd5-4b03-97a9-93546a194414-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.970185 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"56c9a452-ffd5-4b03-97a9-93546a194414","Type":"ContainerDied","Data":"3732eb36b7746243f4a9bad758b1bf9afb106bf058ee751c3feddbab6042cb9c"}
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.970252 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.970286 4836 scope.go:117] "RemoveContainer" containerID="af4d958a142f654427a1232ba635917cf28fa5a94e024ffbc7ea40f1fda64c7a"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.970450 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.970468 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.972644 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 17 14:29:59 crc kubenswrapper[4836]: I0217 14:29:59.972675 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.006241 4836 scope.go:117] "RemoveContainer" containerID="e9f32ae3116b957a3eb3e85cc5cb945cd5cac421baa8b9f3e186e90cea341d86"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.015016 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.030088 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.095213 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:30:00 crc kubenswrapper[4836]: E0217 14:30:00.097522 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-central-agent"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.097554 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-central-agent"
Feb 17 14:30:00 crc kubenswrapper[4836]: E0217 14:30:00.097604 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-notification-agent"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.097612 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-notification-agent"
Feb 17 14:30:00 crc kubenswrapper[4836]: E0217 14:30:00.097638 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="proxy-httpd"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.097645 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="proxy-httpd"
Feb 17 14:30:00 crc kubenswrapper[4836]: E0217 14:30:00.097663 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="sg-core"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.097670 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="sg-core"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.098205 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-notification-agent"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.098255 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="ceilometer-central-agent"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.098276 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="proxy-httpd"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.098310 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" containerName="sg-core"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.098365 4836 scope.go:117] "RemoveContainer" containerID="dfd206b5463d3cc6f1e9888d9e21e49bdbfd5f95e1989f13c38e03bb00682c21"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.112276 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.119087 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.119759 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.141740 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.158651 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-scripts\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.158786 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfvn4\" (UniqueName: \"kubernetes.io/projected/3ff6c86e-b884-480e-b74b-30e4a586b5fa-kube-api-access-pfvn4\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.158818 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.159003 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-config-data\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.159068 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-log-httpd\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.159179 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-run-httpd\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.159236 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.200065 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.216394 4836 scope.go:117] "RemoveContainer" containerID="d3bbefd170a172c21cb9f9e3cfad807a2c0bb5fe5338d9d264fb6ae4c6ff5de7"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.251953 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:30:00 crc kubenswrapper[4836]: E0217 14:30:00.253153 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-pfvn4 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="3ff6c86e-b884-480e-b74b-30e4a586b5fa"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.261862 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-scripts\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.261976 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfvn4\" (UniqueName: \"kubernetes.io/projected/3ff6c86e-b884-480e-b74b-30e4a586b5fa-kube-api-access-pfvn4\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.261996 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.262086 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-config-data\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.262115 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-log-httpd\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.262177 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-run-httpd\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.262208 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.263525 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-log-httpd\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.264619 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-run-httpd\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.267928 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.273135 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-scripts\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.273245 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"]
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.274158 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.276140 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.278553 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-config-data\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.280926 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.285327 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfvn4\" (UniqueName: \"kubernetes.io/projected/3ff6c86e-b884-480e-b74b-30e4a586b5fa-kube-api-access-pfvn4\") pod \"ceilometer-0\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") " pod="openstack/ceilometer-0"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.285551 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.370261 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-secret-volume\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.370755 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-config-volume\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.371345 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cv9x\" (UniqueName: \"kubernetes.io/projected/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-kube-api-access-5cv9x\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.735787 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cv9x\" (UniqueName: \"kubernetes.io/projected/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-kube-api-access-5cv9x\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.735900 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-secret-volume\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.735969 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-config-volume\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.737461 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-config-volume\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.752498 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-secret-volume\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.778513 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cv9x\" (UniqueName: \"kubernetes.io/projected/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-kube-api-access-5cv9x\") pod \"collect-profiles-29522310-jpncr\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.799018 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c9a452-ffd5-4b03-97a9-93546a194414" path="/var/lib/kubelet/pods/56c9a452-ffd5-4b03-97a9-93546a194414/volumes"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.842882 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"]
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.846111 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
Feb 17 14:30:00 crc kubenswrapper[4836]: I0217 14:30:00.991725 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.089601 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.261322 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-combined-ca-bundle\") pod \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") "
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.261455 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-sg-core-conf-yaml\") pod \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") "
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.261507 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfvn4\" (UniqueName: \"kubernetes.io/projected/3ff6c86e-b884-480e-b74b-30e4a586b5fa-kube-api-access-pfvn4\") pod \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") "
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.261568 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-scripts\") pod \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") "
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.261640 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-run-httpd\") pod \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") "
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.261664 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-log-httpd\") pod \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") "
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.261683 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-config-data\") pod \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\" (UID: \"3ff6c86e-b884-480e-b74b-30e4a586b5fa\") "
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.263641 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3ff6c86e-b884-480e-b74b-30e4a586b5fa" (UID: "3ff6c86e-b884-480e-b74b-30e4a586b5fa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.263860 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3ff6c86e-b884-480e-b74b-30e4a586b5fa" (UID: "3ff6c86e-b884-480e-b74b-30e4a586b5fa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.272799 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3ff6c86e-b884-480e-b74b-30e4a586b5fa" (UID: "3ff6c86e-b884-480e-b74b-30e4a586b5fa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.272964 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ff6c86e-b884-480e-b74b-30e4a586b5fa" (UID: "3ff6c86e-b884-480e-b74b-30e4a586b5fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.278449 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-config-data" (OuterVolumeSpecName: "config-data") pod "3ff6c86e-b884-480e-b74b-30e4a586b5fa" (UID: "3ff6c86e-b884-480e-b74b-30e4a586b5fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.280545 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-scripts" (OuterVolumeSpecName: "scripts") pod "3ff6c86e-b884-480e-b74b-30e4a586b5fa" (UID: "3ff6c86e-b884-480e-b74b-30e4a586b5fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.284728 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff6c86e-b884-480e-b74b-30e4a586b5fa-kube-api-access-pfvn4" (OuterVolumeSpecName: "kube-api-access-pfvn4") pod "3ff6c86e-b884-480e-b74b-30e4a586b5fa" (UID: "3ff6c86e-b884-480e-b74b-30e4a586b5fa"). InnerVolumeSpecName "kube-api-access-pfvn4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.366909 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfvn4\" (UniqueName: \"kubernetes.io/projected/3ff6c86e-b884-480e-b74b-30e4a586b5fa-kube-api-access-pfvn4\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.367281 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.367306 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.367318 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ff6c86e-b884-480e-b74b-30e4a586b5fa-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.367326 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.367336 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.367345 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ff6c86e-b884-480e-b74b-30e4a586b5fa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.564056 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"]
Feb 17 14:30:01 crc kubenswrapper[4836]: W0217 14:30:01.577138 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576199c0_9d59_4a1d_bd1d_ec32eb8fac02.slice/crio-acc8635fbbc7bb6f57c5d71b815410418e47fa88d5bfcb304605c21d210e514a WatchSource:0}: Error finding container acc8635fbbc7bb6f57c5d71b815410418e47fa88d5bfcb304605c21d210e514a: Status 404 returned error can't find the container with id acc8635fbbc7bb6f57c5d71b815410418e47fa88d5bfcb304605c21d210e514a
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.644251 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 17 14:30:01 crc kubenswrapper[4836]: I0217 14:30:01.644355 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.044371 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.044871 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.046359 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr"
event={"ID":"576199c0-9d59-4a1d-bd1d-ec32eb8fac02","Type":"ContainerStarted","Data":"acc8635fbbc7bb6f57c5d71b815410418e47fa88d5bfcb304605c21d210e514a"} Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.046514 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.133686 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.150838 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.165489 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.168891 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.181219 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.181903 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.228517 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.301648 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.301710 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.301890 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdnt8\" (UniqueName: \"kubernetes.io/projected/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-kube-api-access-wdnt8\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.302051 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-scripts\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.302264 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.302347 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-config-data\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.302423 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407014 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-config-data\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407085 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407157 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407178 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407311 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdnt8\" (UniqueName: \"kubernetes.io/projected/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-kube-api-access-wdnt8\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407385 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-scripts\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407466 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407799 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-log-httpd\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.407849 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-run-httpd\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.415501 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-scripts\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.415749 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.415796 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-config-data\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.442062 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.450209 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdnt8\" (UniqueName: \"kubernetes.io/projected/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-kube-api-access-wdnt8\") pod \"ceilometer-0\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.510538 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:30:02 crc kubenswrapper[4836]: I0217 14:30:02.587532 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff6c86e-b884-480e-b74b-30e4a586b5fa" path="/var/lib/kubelet/pods/3ff6c86e-b884-480e-b74b-30e4a586b5fa/volumes" Feb 17 14:30:03 crc kubenswrapper[4836]: I0217 14:30:03.381917 4836 generic.go:334] "Generic (PLEG): container finished" podID="576199c0-9d59-4a1d-bd1d-ec32eb8fac02" containerID="f70928b304ac14ef13a56d539a6d1c81f6a91cc1b5670ddde0c85a6fb06b84fe" exitCode=0 Feb 17 14:30:03 crc kubenswrapper[4836]: I0217 14:30:03.382001 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr" event={"ID":"576199c0-9d59-4a1d-bd1d-ec32eb8fac02","Type":"ContainerDied","Data":"f70928b304ac14ef13a56d539a6d1c81f6a91cc1b5670ddde0c85a6fb06b84fe"} Feb 17 14:30:04 crc kubenswrapper[4836]: I0217 14:30:04.099916 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:04 crc kubenswrapper[4836]: I0217 14:30:04.100422 4836 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 14:30:04 crc kubenswrapper[4836]: I0217 14:30:04.104737 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 14:30:04 crc kubenswrapper[4836]: I0217 14:30:04.431722 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.435559 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.455164 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cv9x\" (UniqueName: \"kubernetes.io/projected/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-kube-api-access-5cv9x\") pod \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.455269 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-secret-volume\") pod \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.455288 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-config-volume\") pod \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\" (UID: \"576199c0-9d59-4a1d-bd1d-ec32eb8fac02\") " Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.456569 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-config-volume" (OuterVolumeSpecName: "config-volume") pod "576199c0-9d59-4a1d-bd1d-ec32eb8fac02" (UID: "576199c0-9d59-4a1d-bd1d-ec32eb8fac02"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.462663 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-kube-api-access-5cv9x" (OuterVolumeSpecName: "kube-api-access-5cv9x") pod "576199c0-9d59-4a1d-bd1d-ec32eb8fac02" (UID: "576199c0-9d59-4a1d-bd1d-ec32eb8fac02"). 
InnerVolumeSpecName "kube-api-access-5cv9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.465938 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "576199c0-9d59-4a1d-bd1d-ec32eb8fac02" (UID: "576199c0-9d59-4a1d-bd1d-ec32eb8fac02"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.516892 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr" event={"ID":"576199c0-9d59-4a1d-bd1d-ec32eb8fac02","Type":"ContainerDied","Data":"acc8635fbbc7bb6f57c5d71b815410418e47fa88d5bfcb304605c21d210e514a"} Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.517271 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acc8635fbbc7bb6f57c5d71b815410418e47fa88d5bfcb304605c21d210e514a" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.517030 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522310-jpncr" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.561845 4836 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.561893 4836 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:11 crc kubenswrapper[4836]: I0217 14:30:11.561906 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cv9x\" (UniqueName: \"kubernetes.io/projected/576199c0-9d59-4a1d-bd1d-ec32eb8fac02-kube-api-access-5cv9x\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:12 crc kubenswrapper[4836]: I0217 14:30:12.270868 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:12 crc kubenswrapper[4836]: W0217 14:30:12.272930 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca3b87dc_01ac_4a72_a432_9a4503f13c0b.slice/crio-53fd4772108969ad15e091e6f34ccf6c953cf089c9c05df86f2022f850caee6d WatchSource:0}: Error finding container 53fd4772108969ad15e091e6f34ccf6c953cf089c9c05df86f2022f850caee6d: Status 404 returned error can't find the container with id 53fd4772108969ad15e091e6f34ccf6c953cf089c9c05df86f2022f850caee6d Feb 17 14:30:12 crc kubenswrapper[4836]: I0217 14:30:12.533803 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerStarted","Data":"53fd4772108969ad15e091e6f34ccf6c953cf089c9c05df86f2022f850caee6d"} Feb 17 14:30:12 crc kubenswrapper[4836]: I0217 14:30:12.546724 4836 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-896gw" event={"ID":"5284ac65-3629-4b0f-94ce-114964fe6d15","Type":"ContainerStarted","Data":"959d5cc1d8ba4d131ae83ee3b420db014e052fb98b3a6fa5c53753ae63d88003"} Feb 17 14:30:12 crc kubenswrapper[4836]: I0217 14:30:12.572354 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-896gw" podStartSLOduration=2.747300667 podStartE2EDuration="23.572332207s" podCreationTimestamp="2026-02-17 14:29:49 +0000 UTC" firstStartedPulling="2026-02-17 14:29:51.040835074 +0000 UTC m=+1417.383763343" lastFinishedPulling="2026-02-17 14:30:11.865866614 +0000 UTC m=+1438.208794883" observedRunningTime="2026-02-17 14:30:12.564613668 +0000 UTC m=+1438.907541947" watchObservedRunningTime="2026-02-17 14:30:12.572332207 +0000 UTC m=+1438.915260486" Feb 17 14:30:13 crc kubenswrapper[4836]: I0217 14:30:13.561182 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerStarted","Data":"f0419b2e3c8ef9c0f54a84e7512a9cde99f00cb5aa7b44a637e787be45f07ccd"} Feb 17 14:30:14 crc kubenswrapper[4836]: I0217 14:30:14.592858 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerStarted","Data":"d3297c8494404e0f55bc6c3d7032d9a3295e84dc803655d2e2df3e6ab7a747be"} Feb 17 14:30:15 crc kubenswrapper[4836]: I0217 14:30:15.609028 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerStarted","Data":"4337aced693eb74520c39cdaad50c2d06e723483b872e61eb2f707cc9550085e"} Feb 17 14:30:17 crc kubenswrapper[4836]: I0217 14:30:17.693383 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerStarted","Data":"fcda893980936a4e72f451c12f1e7a2007edb9f1324581557ec99a4e77ee81f9"} Feb 17 14:30:17 crc kubenswrapper[4836]: I0217 14:30:17.694094 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-central-agent" containerID="cri-o://f0419b2e3c8ef9c0f54a84e7512a9cde99f00cb5aa7b44a637e787be45f07ccd" gracePeriod=30 Feb 17 14:30:17 crc kubenswrapper[4836]: I0217 14:30:17.694225 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:30:17 crc kubenswrapper[4836]: I0217 14:30:17.694884 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="proxy-httpd" containerID="cri-o://fcda893980936a4e72f451c12f1e7a2007edb9f1324581557ec99a4e77ee81f9" gracePeriod=30 Feb 17 14:30:17 crc kubenswrapper[4836]: I0217 14:30:17.694956 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="sg-core" containerID="cri-o://4337aced693eb74520c39cdaad50c2d06e723483b872e61eb2f707cc9550085e" gracePeriod=30 Feb 17 14:30:17 crc kubenswrapper[4836]: I0217 14:30:17.695009 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-notification-agent" containerID="cri-o://d3297c8494404e0f55bc6c3d7032d9a3295e84dc803655d2e2df3e6ab7a747be" gracePeriod=30 Feb 17 14:30:18 crc kubenswrapper[4836]: I0217 14:30:18.711850 4836 generic.go:334] "Generic (PLEG): container finished" podID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerID="4337aced693eb74520c39cdaad50c2d06e723483b872e61eb2f707cc9550085e" exitCode=2 Feb 17 14:30:18 crc 
kubenswrapper[4836]: I0217 14:30:18.712426 4836 generic.go:334] "Generic (PLEG): container finished" podID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerID="d3297c8494404e0f55bc6c3d7032d9a3295e84dc803655d2e2df3e6ab7a747be" exitCode=0 Feb 17 14:30:18 crc kubenswrapper[4836]: I0217 14:30:18.712338 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerDied","Data":"4337aced693eb74520c39cdaad50c2d06e723483b872e61eb2f707cc9550085e"} Feb 17 14:30:18 crc kubenswrapper[4836]: I0217 14:30:18.713760 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerDied","Data":"d3297c8494404e0f55bc6c3d7032d9a3295e84dc803655d2e2df3e6ab7a747be"} Feb 17 14:30:20 crc kubenswrapper[4836]: I0217 14:30:20.158603 4836 generic.go:334] "Generic (PLEG): container finished" podID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerID="fcda893980936a4e72f451c12f1e7a2007edb9f1324581557ec99a4e77ee81f9" exitCode=0 Feb 17 14:30:20 crc kubenswrapper[4836]: I0217 14:30:20.158679 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerDied","Data":"fcda893980936a4e72f451c12f1e7a2007edb9f1324581557ec99a4e77ee81f9"} Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.123462 4836 generic.go:334] "Generic (PLEG): container finished" podID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerID="f0419b2e3c8ef9c0f54a84e7512a9cde99f00cb5aa7b44a637e787be45f07ccd" exitCode=0 Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.123697 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerDied","Data":"f0419b2e3c8ef9c0f54a84e7512a9cde99f00cb5aa7b44a637e787be45f07ccd"} Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.124438 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ca3b87dc-01ac-4a72-a432-9a4503f13c0b","Type":"ContainerDied","Data":"53fd4772108969ad15e091e6f34ccf6c953cf089c9c05df86f2022f850caee6d"} Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.124472 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53fd4772108969ad15e091e6f34ccf6c953cf089c9c05df86f2022f850caee6d" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.148780 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.317374 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-log-httpd\") pod \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.317463 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-scripts\") pod \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.317551 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-sg-core-conf-yaml\") pod \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.317906 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-config-data\") pod \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\" (UID: 
\"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.317949 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-combined-ca-bundle\") pod \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.318060 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdnt8\" (UniqueName: \"kubernetes.io/projected/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-kube-api-access-wdnt8\") pod \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.318183 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-run-httpd\") pod \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\" (UID: \"ca3b87dc-01ac-4a72-a432-9a4503f13c0b\") " Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.322766 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ca3b87dc-01ac-4a72-a432-9a4503f13c0b" (UID: "ca3b87dc-01ac-4a72-a432-9a4503f13c0b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.323761 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ca3b87dc-01ac-4a72-a432-9a4503f13c0b" (UID: "ca3b87dc-01ac-4a72-a432-9a4503f13c0b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.344650 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-kube-api-access-wdnt8" (OuterVolumeSpecName: "kube-api-access-wdnt8") pod "ca3b87dc-01ac-4a72-a432-9a4503f13c0b" (UID: "ca3b87dc-01ac-4a72-a432-9a4503f13c0b"). InnerVolumeSpecName "kube-api-access-wdnt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.399524 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-scripts" (OuterVolumeSpecName: "scripts") pod "ca3b87dc-01ac-4a72-a432-9a4503f13c0b" (UID: "ca3b87dc-01ac-4a72-a432-9a4503f13c0b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.422618 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdnt8\" (UniqueName: \"kubernetes.io/projected/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-kube-api-access-wdnt8\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.422681 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.422690 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.422698 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:28 
crc kubenswrapper[4836]: I0217 14:30:28.474538 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ca3b87dc-01ac-4a72-a432-9a4503f13c0b" (UID: "ca3b87dc-01ac-4a72-a432-9a4503f13c0b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.525280 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.591548 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca3b87dc-01ac-4a72-a432-9a4503f13c0b" (UID: "ca3b87dc-01ac-4a72-a432-9a4503f13c0b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.608331 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-config-data" (OuterVolumeSpecName: "config-data") pod "ca3b87dc-01ac-4a72-a432-9a4503f13c0b" (UID: "ca3b87dc-01ac-4a72-a432-9a4503f13c0b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.628784 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:28 crc kubenswrapper[4836]: I0217 14:30:28.628823 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3b87dc-01ac-4a72-a432-9a4503f13c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.137325 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.179362 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.196154 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.228094 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:29 crc kubenswrapper[4836]: E0217 14:30:29.228681 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="sg-core" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.228708 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="sg-core" Feb 17 14:30:29 crc kubenswrapper[4836]: E0217 14:30:29.228744 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576199c0-9d59-4a1d-bd1d-ec32eb8fac02" containerName="collect-profiles" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.228751 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="576199c0-9d59-4a1d-bd1d-ec32eb8fac02" containerName="collect-profiles" Feb 17 14:30:29 crc 
kubenswrapper[4836]: E0217 14:30:29.228766 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-notification-agent" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.228776 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-notification-agent" Feb 17 14:30:29 crc kubenswrapper[4836]: E0217 14:30:29.228790 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-central-agent" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.228798 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-central-agent" Feb 17 14:30:29 crc kubenswrapper[4836]: E0217 14:30:29.228828 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="proxy-httpd" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.228836 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="proxy-httpd" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.229093 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="576199c0-9d59-4a1d-bd1d-ec32eb8fac02" containerName="collect-profiles" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.229127 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="proxy-httpd" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.229140 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-central-agent" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.229162 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" 
containerName="sg-core" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.229176 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" containerName="ceilometer-notification-agent" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.231423 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.235736 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.236026 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.258579 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.345412 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-run-httpd\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.345521 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.345706 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " 
pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.345801 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-log-httpd\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.346075 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-config-data\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.346419 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-scripts\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.346591 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7whp\" (UniqueName: \"kubernetes.io/projected/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-kube-api-access-l7whp\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.448508 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-run-httpd\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.448606 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.448655 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.448677 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-log-httpd\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.448721 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-config-data\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.448799 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-scripts\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.448853 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7whp\" (UniqueName: \"kubernetes.io/projected/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-kube-api-access-l7whp\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 
14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.449168 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-run-httpd\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.449478 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-log-httpd\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.455124 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.456447 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-config-data\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.461443 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-scripts\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.461680 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.467729 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7whp\" (UniqueName: \"kubernetes.io/projected/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-kube-api-access-l7whp\") pod \"ceilometer-0\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") " pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.602525 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.766397 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.766900 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.766979 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.769653 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c09fe81ffce38e5d9ef4195d8e69df0edfb238c5a8b73cb36be460e79dea4bb"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 
14:30:29 crc kubenswrapper[4836]: I0217 14:30:29.770398 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://3c09fe81ffce38e5d9ef4195d8e69df0edfb238c5a8b73cb36be460e79dea4bb" gracePeriod=600 Feb 17 14:30:30 crc kubenswrapper[4836]: W0217 14:30:30.147942 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7977b0a7_fd9c_4d3c_bc21_fbf9d0e70506.slice/crio-f76c5ca2919dcba6a57d2c3c18620a0d364cbf08343d7aa01bc14bf17c1cc24c WatchSource:0}: Error finding container f76c5ca2919dcba6a57d2c3c18620a0d364cbf08343d7aa01bc14bf17c1cc24c: Status 404 returned error can't find the container with id f76c5ca2919dcba6a57d2c3c18620a0d364cbf08343d7aa01bc14bf17c1cc24c Feb 17 14:30:30 crc kubenswrapper[4836]: I0217 14:30:30.150166 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:30 crc kubenswrapper[4836]: I0217 14:30:30.172736 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="3c09fe81ffce38e5d9ef4195d8e69df0edfb238c5a8b73cb36be460e79dea4bb" exitCode=0 Feb 17 14:30:30 crc kubenswrapper[4836]: I0217 14:30:30.172802 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"3c09fe81ffce38e5d9ef4195d8e69df0edfb238c5a8b73cb36be460e79dea4bb"} Feb 17 14:30:30 crc kubenswrapper[4836]: I0217 14:30:30.172869 4836 scope.go:117] "RemoveContainer" containerID="790067b54b3531952a7756a09b793da1fc53330ef71b8011e59f530ae444594e" Feb 17 14:30:30 crc kubenswrapper[4836]: I0217 14:30:30.584120 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ca3b87dc-01ac-4a72-a432-9a4503f13c0b" path="/var/lib/kubelet/pods/ca3b87dc-01ac-4a72-a432-9a4503f13c0b/volumes" Feb 17 14:30:31 crc kubenswrapper[4836]: I0217 14:30:31.191811 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerStarted","Data":"2a94678a06d4de49c0c0b68a141d38b36f2fd3139243ed587746cebb8d0a09d9"} Feb 17 14:30:31 crc kubenswrapper[4836]: I0217 14:30:31.192243 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerStarted","Data":"f76c5ca2919dcba6a57d2c3c18620a0d364cbf08343d7aa01bc14bf17c1cc24c"} Feb 17 14:30:31 crc kubenswrapper[4836]: I0217 14:30:31.200979 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"} Feb 17 14:30:32 crc kubenswrapper[4836]: I0217 14:30:32.213259 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerStarted","Data":"36e8022aac7122767c45efd486545780cddb57ef019acdaab3f1b9c40d6c965d"} Feb 17 14:30:32 crc kubenswrapper[4836]: I0217 14:30:32.308016 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:30:33 crc kubenswrapper[4836]: I0217 14:30:33.229401 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerStarted","Data":"c56beb0cbfb019f8f773e8f80f2e2999175246d8b61e325eddb4ad43a6b127d4"} Feb 17 14:30:33 crc kubenswrapper[4836]: I0217 14:30:33.232133 4836 generic.go:334] "Generic (PLEG): container finished" podID="5284ac65-3629-4b0f-94ce-114964fe6d15" 
containerID="959d5cc1d8ba4d131ae83ee3b420db014e052fb98b3a6fa5c53753ae63d88003" exitCode=0 Feb 17 14:30:33 crc kubenswrapper[4836]: I0217 14:30:33.232307 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-896gw" event={"ID":"5284ac65-3629-4b0f-94ce-114964fe6d15","Type":"ContainerDied","Data":"959d5cc1d8ba4d131ae83ee3b420db014e052fb98b3a6fa5c53753ae63d88003"} Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.124794 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.238399 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-combined-ca-bundle\") pod \"5284ac65-3629-4b0f-94ce-114964fe6d15\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.238550 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-config-data\") pod \"5284ac65-3629-4b0f-94ce-114964fe6d15\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.238732 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lng9\" (UniqueName: \"kubernetes.io/projected/5284ac65-3629-4b0f-94ce-114964fe6d15-kube-api-access-9lng9\") pod \"5284ac65-3629-4b0f-94ce-114964fe6d15\" (UID: \"5284ac65-3629-4b0f-94ce-114964fe6d15\") " Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.238997 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-scripts\") pod \"5284ac65-3629-4b0f-94ce-114964fe6d15\" (UID: 
\"5284ac65-3629-4b0f-94ce-114964fe6d15\") " Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.247235 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-scripts" (OuterVolumeSpecName: "scripts") pod "5284ac65-3629-4b0f-94ce-114964fe6d15" (UID: "5284ac65-3629-4b0f-94ce-114964fe6d15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.265203 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-896gw" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.265544 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-896gw" event={"ID":"5284ac65-3629-4b0f-94ce-114964fe6d15","Type":"ContainerDied","Data":"9e6101c106f2eea30b241c7cb1b7e2d51a47ac641af556866c4a4cf6f00c0aad"} Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.265676 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e6101c106f2eea30b241c7cb1b7e2d51a47ac641af556866c4a4cf6f00c0aad" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.268344 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5284ac65-3629-4b0f-94ce-114964fe6d15-kube-api-access-9lng9" (OuterVolumeSpecName: "kube-api-access-9lng9") pod "5284ac65-3629-4b0f-94ce-114964fe6d15" (UID: "5284ac65-3629-4b0f-94ce-114964fe6d15"). InnerVolumeSpecName "kube-api-access-9lng9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.269574 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerStarted","Data":"0f482f5faddf89800b640dc5e78fd2ecafe8e9e7010c8aad8dada8307c95b71c"} Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.269828 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-central-agent" containerID="cri-o://2a94678a06d4de49c0c0b68a141d38b36f2fd3139243ed587746cebb8d0a09d9" gracePeriod=30 Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.269971 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.270475 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="proxy-httpd" containerID="cri-o://0f482f5faddf89800b640dc5e78fd2ecafe8e9e7010c8aad8dada8307c95b71c" gracePeriod=30 Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.270536 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="sg-core" containerID="cri-o://c56beb0cbfb019f8f773e8f80f2e2999175246d8b61e325eddb4ad43a6b127d4" gracePeriod=30 Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.270572 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-notification-agent" containerID="cri-o://36e8022aac7122767c45efd486545780cddb57ef019acdaab3f1b9c40d6c965d" gracePeriod=30 Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.316978 4836 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-config-data" (OuterVolumeSpecName: "config-data") pod "5284ac65-3629-4b0f-94ce-114964fe6d15" (UID: "5284ac65-3629-4b0f-94ce-114964fe6d15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.318587 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5284ac65-3629-4b0f-94ce-114964fe6d15" (UID: "5284ac65-3629-4b0f-94ce-114964fe6d15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.324396 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.429066117 podStartE2EDuration="6.32436186s" podCreationTimestamp="2026-02-17 14:30:29 +0000 UTC" firstStartedPulling="2026-02-17 14:30:30.158684057 +0000 UTC m=+1456.501612326" lastFinishedPulling="2026-02-17 14:30:34.0539798 +0000 UTC m=+1460.396908069" observedRunningTime="2026-02-17 14:30:35.312655483 +0000 UTC m=+1461.655583772" watchObservedRunningTime="2026-02-17 14:30:35.32436186 +0000 UTC m=+1461.667290139" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.354225 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.354310 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.354328 
4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5284ac65-3629-4b0f-94ce-114964fe6d15-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.354342 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lng9\" (UniqueName: \"kubernetes.io/projected/5284ac65-3629-4b0f-94ce-114964fe6d15-kube-api-access-9lng9\") on node \"crc\" DevicePath \"\"" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.411529 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 14:30:35 crc kubenswrapper[4836]: E0217 14:30:35.412069 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5284ac65-3629-4b0f-94ce-114964fe6d15" containerName="nova-cell0-conductor-db-sync" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.412090 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5284ac65-3629-4b0f-94ce-114964fe6d15" containerName="nova-cell0-conductor-db-sync" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.412323 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="5284ac65-3629-4b0f-94ce-114964fe6d15" containerName="nova-cell0-conductor-db-sync" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.413091 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.436973 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.565644 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgv87\" (UniqueName: \"kubernetes.io/projected/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-kube-api-access-cgv87\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.566479 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.566608 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.670117 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgv87\" (UniqueName: \"kubernetes.io/projected/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-kube-api-access-cgv87\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.670705 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.670764 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.678578 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.681111 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.693256 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgv87\" (UniqueName: \"kubernetes.io/projected/00cffdcb-70af-415e-86a8-4f8eb7c0ba6f-kube-api-access-cgv87\") pod \"nova-cell0-conductor-0\" (UID: \"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f\") " pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.737025 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r5vl4"] Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.740082 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.765656 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5vl4"] Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.787410 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.876111 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftf9s\" (UniqueName: \"kubernetes.io/projected/5d52263a-9417-43b6-903c-79e41b1200a0-kube-api-access-ftf9s\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.876997 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52263a-9417-43b6-903c-79e41b1200a0-utilities\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.877214 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52263a-9417-43b6-903c-79e41b1200a0-catalog-content\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.979733 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52263a-9417-43b6-903c-79e41b1200a0-utilities\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " 
pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.979821 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52263a-9417-43b6-903c-79e41b1200a0-catalog-content\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.980313 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftf9s\" (UniqueName: \"kubernetes.io/projected/5d52263a-9417-43b6-903c-79e41b1200a0-kube-api-access-ftf9s\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.983887 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d52263a-9417-43b6-903c-79e41b1200a0-utilities\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:35 crc kubenswrapper[4836]: I0217 14:30:35.984015 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d52263a-9417-43b6-903c-79e41b1200a0-catalog-content\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.018883 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftf9s\" (UniqueName: \"kubernetes.io/projected/5d52263a-9417-43b6-903c-79e41b1200a0-kube-api-access-ftf9s\") pod \"redhat-operators-r5vl4\" (UID: \"5d52263a-9417-43b6-903c-79e41b1200a0\") " pod="openshift-marketplace/redhat-operators-r5vl4" Feb 
17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.133728 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.308484 4836 generic.go:334] "Generic (PLEG): container finished" podID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerID="0f482f5faddf89800b640dc5e78fd2ecafe8e9e7010c8aad8dada8307c95b71c" exitCode=0 Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.308542 4836 generic.go:334] "Generic (PLEG): container finished" podID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerID="c56beb0cbfb019f8f773e8f80f2e2999175246d8b61e325eddb4ad43a6b127d4" exitCode=2 Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.308551 4836 generic.go:334] "Generic (PLEG): container finished" podID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerID="36e8022aac7122767c45efd486545780cddb57ef019acdaab3f1b9c40d6c965d" exitCode=0 Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.308595 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerDied","Data":"0f482f5faddf89800b640dc5e78fd2ecafe8e9e7010c8aad8dada8307c95b71c"} Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.308663 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerDied","Data":"c56beb0cbfb019f8f773e8f80f2e2999175246d8b61e325eddb4ad43a6b127d4"} Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.308682 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerDied","Data":"36e8022aac7122767c45efd486545780cddb57ef019acdaab3f1b9c40d6c965d"} Feb 17 14:30:36 crc kubenswrapper[4836]: W0217 14:30:36.451200 4836 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00cffdcb_70af_415e_86a8_4f8eb7c0ba6f.slice/crio-8490ed987ac63e5c963f1f64f86a48b7a33db4ee3520b61fb8dfa72cae6c38d4 WatchSource:0}: Error finding container 8490ed987ac63e5c963f1f64f86a48b7a33db4ee3520b61fb8dfa72cae6c38d4: Status 404 returned error can't find the container with id 8490ed987ac63e5c963f1f64f86a48b7a33db4ee3520b61fb8dfa72cae6c38d4 Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.455055 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 14:30:36 crc kubenswrapper[4836]: I0217 14:30:36.671774 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5vl4"] Feb 17 14:30:37 crc kubenswrapper[4836]: I0217 14:30:37.325356 4836 generic.go:334] "Generic (PLEG): container finished" podID="5d52263a-9417-43b6-903c-79e41b1200a0" containerID="543693d067276811874d1e0bb4d0e4c0d0aa037b97569dbf9646ef328721db65" exitCode=0 Feb 17 14:30:37 crc kubenswrapper[4836]: I0217 14:30:37.325562 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5vl4" event={"ID":"5d52263a-9417-43b6-903c-79e41b1200a0","Type":"ContainerDied","Data":"543693d067276811874d1e0bb4d0e4c0d0aa037b97569dbf9646ef328721db65"} Feb 17 14:30:37 crc kubenswrapper[4836]: I0217 14:30:37.325606 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5vl4" event={"ID":"5d52263a-9417-43b6-903c-79e41b1200a0","Type":"ContainerStarted","Data":"3c5121268249f146eb49c77508540398c1fbd6e327b92f34e18793bdae5e01d9"} Feb 17 14:30:37 crc kubenswrapper[4836]: I0217 14:30:37.334542 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f","Type":"ContainerStarted","Data":"7a8ca6cfe91d443fa0ff1a509a189535b636308c6c14471f70f218c7b11dc7a1"} Feb 17 14:30:37 crc kubenswrapper[4836]: I0217 
14:30:37.334620 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"00cffdcb-70af-415e-86a8-4f8eb7c0ba6f","Type":"ContainerStarted","Data":"8490ed987ac63e5c963f1f64f86a48b7a33db4ee3520b61fb8dfa72cae6c38d4"} Feb 17 14:30:37 crc kubenswrapper[4836]: I0217 14:30:37.335437 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:37 crc kubenswrapper[4836]: I0217 14:30:37.400044 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.400012544 podStartE2EDuration="2.400012544s" podCreationTimestamp="2026-02-17 14:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:30:37.380751353 +0000 UTC m=+1463.723679652" watchObservedRunningTime="2026-02-17 14:30:37.400012544 +0000 UTC m=+1463.742940813" Feb 17 14:30:45 crc kubenswrapper[4836]: I0217 14:30:45.837192 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.491244 4836 generic.go:334] "Generic (PLEG): container finished" podID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerID="2a94678a06d4de49c0c0b68a141d38b36f2fd3139243ed587746cebb8d0a09d9" exitCode=0 Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.491515 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerDied","Data":"2a94678a06d4de49c0c0b68a141d38b36f2fd3139243ed587746cebb8d0a09d9"} Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.596904 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-lqvvn"] Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.598638 4836 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.601513 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.607182 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lqvvn"] Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.609199 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.716911 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-config-data\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.716965 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b2gc\" (UniqueName: \"kubernetes.io/projected/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-kube-api-access-5b2gc\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.718056 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-scripts\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.718142 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.826376 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-config-data\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.826471 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b2gc\" (UniqueName: \"kubernetes.io/projected/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-kube-api-access-5b2gc\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.826952 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-scripts\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.827004 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.846403 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.847015 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-config-data\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.851921 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-scripts\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.906530 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.924909 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.926360 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b2gc\" (UniqueName: \"kubernetes.io/projected/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-kube-api-access-5b2gc\") pod \"nova-cell0-cell-mapping-lqvvn\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.935005 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.938049 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n528b\" (UniqueName: \"kubernetes.io/projected/c429025c-a79e-425a-987a-773baaba5ef2-kube-api-access-n528b\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.938157 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-config-data\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.938191 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.938255 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c429025c-a79e-425a-987a-773baaba5ef2-logs\") pod 
\"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:47 crc kubenswrapper[4836]: I0217 14:30:47.982538 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.038938 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.041184 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n528b\" (UniqueName: \"kubernetes.io/projected/c429025c-a79e-425a-987a-773baaba5ef2-kube-api-access-n528b\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.041294 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-config-data\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.041338 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.041424 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c429025c-a79e-425a-987a-773baaba5ef2-logs\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.053672 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.056653 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.058778 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-config-data\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.060836 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c429025c-a79e-425a-987a-773baaba5ef2-logs\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.075105 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.100752 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n528b\" (UniqueName: \"kubernetes.io/projected/c429025c-a79e-425a-987a-773baaba5ef2-kube-api-access-n528b\") pod \"nova-api-0\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.111821 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.143159 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6c45\" (UniqueName: 
\"kubernetes.io/projected/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-kube-api-access-w6c45\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.143243 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-config-data\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.143298 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.206197 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.214883 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.222635 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.236433 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.247340 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.252735 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-config-data\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.252892 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.252971 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czlhp\" (UniqueName: \"kubernetes.io/projected/3d6e757d-b7e9-417b-a63e-94879c7f3f74-kube-api-access-czlhp\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.253009 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6e757d-b7e9-417b-a63e-94879c7f3f74-logs\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.253097 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.253214 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6c45\" (UniqueName: \"kubernetes.io/projected/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-kube-api-access-w6c45\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.253242 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-config-data\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.259740 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.270552 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-config-data\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.288083 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6c45\" (UniqueName: 
\"kubernetes.io/projected/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-kube-api-access-w6c45\") pod \"nova-scheduler-0\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.319736 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.355774 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.355903 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-config-data\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.356054 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czlhp\" (UniqueName: \"kubernetes.io/projected/3d6e757d-b7e9-417b-a63e-94879c7f3f74-kube-api-access-czlhp\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.356090 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6e757d-b7e9-417b-a63e-94879c7f3f74-logs\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.356655 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3d6e757d-b7e9-417b-a63e-94879c7f3f74-logs\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.363265 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-config-data\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.368884 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.372267 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.385870 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.386269 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.394474 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czlhp\" (UniqueName: \"kubernetes.io/projected/3d6e757d-b7e9-417b-a63e-94879c7f3f74-kube-api-access-czlhp\") pod \"nova-metadata-0\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") " pod="openstack/nova-metadata-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.466055 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69kdk\" (UniqueName: 
\"kubernetes.io/projected/73de5f3f-982c-4471-b91b-e3725da6be03-kube-api-access-69kdk\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.466452 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.466950 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.491866 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.519878 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.557419 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cd565959-cbrcp"] Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.559959 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570095 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570164 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-svc\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570196 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570350 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570383 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570405 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl67m\" (UniqueName: \"kubernetes.io/projected/d8b08728-c946-43e4-85fa-0b033034bd26-kube-api-access-rl67m\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570459 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69kdk\" (UniqueName: \"kubernetes.io/projected/73de5f3f-982c-4471-b91b-e3725da6be03-kube-api-access-69kdk\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570494 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-config\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.570512 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.579652 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.580009 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.610300 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69kdk\" (UniqueName: \"kubernetes.io/projected/73de5f3f-982c-4471-b91b-e3725da6be03-kube-api-access-69kdk\") pod \"nova-cell1-novncproxy-0\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.616455 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-cbrcp"]
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.653545 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.672096 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-config\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.672161 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.672246 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-svc\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.672270 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.672417 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.672446 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl67m\" (UniqueName: \"kubernetes.io/projected/d8b08728-c946-43e4-85fa-0b033034bd26-kube-api-access-rl67m\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.674442 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-svc\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.678127 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-config\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.680159 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-swift-storage-0\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.680386 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-sb\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.680911 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-nb\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.702488 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl67m\" (UniqueName: \"kubernetes.io/projected/d8b08728-c946-43e4-85fa-0b033034bd26-kube-api-access-rl67m\") pod \"dnsmasq-dns-78cd565959-cbrcp\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.722933 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:30:48 crc kubenswrapper[4836]: I0217 14:30:48.900053 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-cbrcp"
Feb 17 14:30:52 crc kubenswrapper[4836]: I0217 14:30:52.630039 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 14:30:52 crc kubenswrapper[4836]: I0217 14:30:52.660587 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.872678 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bz94v"]
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.875418 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.879931 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.880247 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.905368 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bz94v"]
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.914757 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-config-data\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.914893 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn56p\" (UniqueName: \"kubernetes.io/projected/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-kube-api-access-hn56p\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.914970 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-scripts\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:54 crc kubenswrapper[4836]: I0217 14:30:54.915080 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.017412 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn56p\" (UniqueName: \"kubernetes.io/projected/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-kube-api-access-hn56p\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.017540 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-scripts\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.017658 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.017868 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-config-data\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.025092 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.027237 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-config-data\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.028128 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-scripts\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.044436 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn56p\" (UniqueName: \"kubernetes.io/projected/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-kube-api-access-hn56p\") pod \"nova-cell1-conductor-db-sync-bz94v\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:55 crc kubenswrapper[4836]: I0217 14:30:55.214072 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bz94v"
Feb 17 14:30:59 crc kubenswrapper[4836]: E0217 14:30:59.238808 4836 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 17 14:30:59 crc kubenswrapper[4836]: E0217 14:30:59.239862 4836 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftf9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-r5vl4_openshift-marketplace(5d52263a-9417-43b6-903c-79e41b1200a0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 17 14:30:59 crc kubenswrapper[4836]: E0217 14:30:59.241119 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-r5vl4" podUID="5d52263a-9417-43b6-903c-79e41b1200a0"
Feb 17 14:30:59 crc kubenswrapper[4836]: E0217 14:30:59.741281 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-r5vl4" podUID="5d52263a-9417-43b6-903c-79e41b1200a0"
Feb 17 14:30:59 crc kubenswrapper[4836]: I0217 14:30:59.984038 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.143985 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-combined-ca-bundle\") pod \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") "
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.144086 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-sg-core-conf-yaml\") pod \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") "
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.144192 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-config-data\") pod \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") "
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.144367 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-run-httpd\") pod \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") "
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.144384 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-scripts\") pod \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") "
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.144455 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-log-httpd\") pod \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") "
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.144513 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7whp\" (UniqueName: \"kubernetes.io/projected/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-kube-api-access-l7whp\") pod \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\" (UID: \"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506\") "
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.177276 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" (UID: "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.177642 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" (UID: "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.182955 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-kube-api-access-l7whp" (OuterVolumeSpecName: "kube-api-access-l7whp") pod "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" (UID: "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506"). InnerVolumeSpecName "kube-api-access-l7whp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.204037 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-scripts" (OuterVolumeSpecName: "scripts") pod "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" (UID: "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.225490 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" (UID: "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.248702 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.248761 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.248777 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.248788 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7whp\" (UniqueName: \"kubernetes.io/projected/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-kube-api-access-l7whp\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.248801 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.297011 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" (UID: "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.352767 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.358407 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.381705 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-config-data" (OuterVolumeSpecName: "config-data") pod "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" (UID: "7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.455544 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.794942 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bz94v"]
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.809661 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.810004 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506","Type":"ContainerDied","Data":"f76c5ca2919dcba6a57d2c3c18620a0d364cbf08343d7aa01bc14bf17c1cc24c"}
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.810751 4836 scope.go:117] "RemoveContainer" containerID="0f482f5faddf89800b640dc5e78fd2ecafe8e9e7010c8aad8dada8307c95b71c"
Feb 17 14:31:00 crc kubenswrapper[4836]: W0217 14:31:00.823432 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50c442bd_1a4d_4e8f_b3b2_c2e6c97faeed.slice/crio-7520be2580517ac140f3d2f437db810e791805bf994d997d530a396b328fa465 WatchSource:0}: Error finding container 7520be2580517ac140f3d2f437db810e791805bf994d997d530a396b328fa465: Status 404 returned error can't find the container with id 7520be2580517ac140f3d2f437db810e791805bf994d997d530a396b328fa465
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.837639 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d6e757d-b7e9-417b-a63e-94879c7f3f74","Type":"ContainerStarted","Data":"b6f643b62e3a190c9d6903fa464d7953ef4da0c0399e9c74ba601860f623c445"}
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.852858 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.916776 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-lqvvn"]
Feb 17 14:31:00 crc kubenswrapper[4836]: I0217 14:31:00.995804 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.004528 4836 scope.go:117] "RemoveContainer" containerID="c56beb0cbfb019f8f773e8f80f2e2999175246d8b61e325eddb4ad43a6b127d4"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.047397 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.257585 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:01 crc kubenswrapper[4836]: E0217 14:31:01.258286 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-notification-agent"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258333 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-notification-agent"
Feb 17 14:31:01 crc kubenswrapper[4836]: E0217 14:31:01.258362 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-central-agent"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258369 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-central-agent"
Feb 17 14:31:01 crc kubenswrapper[4836]: E0217 14:31:01.258409 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="sg-core"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258415 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="sg-core"
Feb 17 14:31:01 crc kubenswrapper[4836]: E0217 14:31:01.258436 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="proxy-httpd"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258442 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="proxy-httpd"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258695 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-central-agent"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258712 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="proxy-httpd"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258736 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="sg-core"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.258747 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="ceilometer-notification-agent"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.271160 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.273121 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.277798 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.278377 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.476882 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-scripts\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.479938 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.480235 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qclhz\" (UniqueName: \"kubernetes.io/projected/68b35111-581a-4e2e-9fae-3e0248674655-kube-api-access-qclhz\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.480679 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.480862 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-log-httpd\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.481134 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-config-data\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.481353 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-run-httpd\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.550724 4836 scope.go:117] "RemoveContainer" containerID="36e8022aac7122767c45efd486545780cddb57ef019acdaab3f1b9c40d6c965d"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.555132 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.573755 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.584937 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.585017 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qclhz\" (UniqueName: \"kubernetes.io/projected/68b35111-581a-4e2e-9fae-3e0248674655-kube-api-access-qclhz\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.585107 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.585141 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-log-httpd\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.585164 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-config-data\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.585213 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-run-httpd\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.585233 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-scripts\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.590421 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-cbrcp"]
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.594905 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-scripts\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.595402 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-log-httpd\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.600387 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.600776 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-run-httpd\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0"
Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.601168 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-config-data\") pod \"ceilometer-0\" (UID:
\"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0" Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.619859 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0" Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.629780 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qclhz\" (UniqueName: \"kubernetes.io/projected/68b35111-581a-4e2e-9fae-3e0248674655-kube-api-access-qclhz\") pod \"ceilometer-0\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " pod="openstack/ceilometer-0" Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.734954 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.800381 4836 scope.go:117] "RemoveContainer" containerID="2a94678a06d4de49c0c0b68a141d38b36f2fd3139243ed587746cebb8d0a09d9" Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.866442 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bz94v" event={"ID":"790a788c-3cfe-49c8-b1ff-a83bcedf17e0","Type":"ContainerStarted","Data":"a2bfac90ff95a7b3434ec97c87c23bac5422abcf87cd41094de043f68aa74460"} Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.868874 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"73de5f3f-982c-4471-b91b-e3725da6be03","Type":"ContainerStarted","Data":"73eb695666ad36f06c437207b1c5c555a1d75dfaa1f2995189afe99e230da0da"} Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.873700 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"c429025c-a79e-425a-987a-773baaba5ef2","Type":"ContainerStarted","Data":"156a656ff70823725df87467a6303f3fbea750b4b0136e5d302f89802c6a5c93"} Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.878369 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed","Type":"ContainerStarted","Data":"7520be2580517ac140f3d2f437db810e791805bf994d997d530a396b328fa465"} Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.880888 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" event={"ID":"d8b08728-c946-43e4-85fa-0b033034bd26","Type":"ContainerStarted","Data":"4a4c76bc357a85c3013a688505b9be4f985a6e124e635443b51b48a2960c2a36"} Feb 17 14:31:01 crc kubenswrapper[4836]: I0217 14:31:01.883426 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lqvvn" event={"ID":"3f9d6a93-3d3a-4c5c-85cf-329209cfe911","Type":"ContainerStarted","Data":"2372be5c0fdb7f3175c7e34e83cf6164deb631b4fbf708d784b860e58c9b0a29"} Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.473678 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:02 crc kubenswrapper[4836]: W0217 14:31:02.487606 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b35111_581a_4e2e_9fae_3e0248674655.slice/crio-ae9ad801c19207ea76afd61356edd8b9b7b66ec00cfa25e818baaceb869c4ad6 WatchSource:0}: Error finding container ae9ad801c19207ea76afd61356edd8b9b7b66ec00cfa25e818baaceb869c4ad6: Status 404 returned error can't find the container with id ae9ad801c19207ea76afd61356edd8b9b7b66ec00cfa25e818baaceb869c4ad6 Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.589165 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" 
path="/var/lib/kubelet/pods/7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506/volumes" Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.904542 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerStarted","Data":"ae9ad801c19207ea76afd61356edd8b9b7b66ec00cfa25e818baaceb869c4ad6"} Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.909396 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bz94v" event={"ID":"790a788c-3cfe-49c8-b1ff-a83bcedf17e0","Type":"ContainerStarted","Data":"9a55578dc34e67ce0a93dbbd5c5e496ed951f38d462ffb4dcccf5ec23897e1c5"} Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.920322 4836 generic.go:334] "Generic (PLEG): container finished" podID="d8b08728-c946-43e4-85fa-0b033034bd26" containerID="10880f8e13f3f6efc6d19c175c05a63fc27f01501a301fd0a28b68afaa946ee2" exitCode=0 Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.920455 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" event={"ID":"d8b08728-c946-43e4-85fa-0b033034bd26","Type":"ContainerDied","Data":"10880f8e13f3f6efc6d19c175c05a63fc27f01501a301fd0a28b68afaa946ee2"} Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.930117 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lqvvn" event={"ID":"3f9d6a93-3d3a-4c5c-85cf-329209cfe911","Type":"ContainerStarted","Data":"c224cbe49994301a8cf7d7e85623916f9815d0873ee461d723b64e1a3b753f8d"} Feb 17 14:31:02 crc kubenswrapper[4836]: I0217 14:31:02.945717 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bz94v" podStartSLOduration=8.945682756 podStartE2EDuration="8.945682756s" podCreationTimestamp="2026-02-17 14:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 14:31:02.938466091 +0000 UTC m=+1489.281394370" watchObservedRunningTime="2026-02-17 14:31:02.945682756 +0000 UTC m=+1489.288611045" Feb 17 14:31:03 crc kubenswrapper[4836]: I0217 14:31:03.008446 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-lqvvn" podStartSLOduration=16.00828115 podStartE2EDuration="16.00828115s" podCreationTimestamp="2026-02-17 14:30:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:02.986051298 +0000 UTC m=+1489.328979567" watchObservedRunningTime="2026-02-17 14:31:03.00828115 +0000 UTC m=+1489.351209419" Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.022837 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerStarted","Data":"061b81d1777bbf667556b595b5cc63218a0966b7aa3e7690e90b4b84a2173bed"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.034611 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d6e757d-b7e9-417b-a63e-94879c7f3f74","Type":"ContainerStarted","Data":"7b113b343450d9b94ea6adbfe150b4f215416e1008c8bb123e85d6f538396d6e"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.034669 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d6e757d-b7e9-417b-a63e-94879c7f3f74","Type":"ContainerStarted","Data":"376eec79f26ee001240c9935be991e0037ae7235786eba961156abcccae1aa84"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.034829 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-log" containerID="cri-o://376eec79f26ee001240c9935be991e0037ae7235786eba961156abcccae1aa84" gracePeriod=30 Feb 17 
14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.035420 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-metadata" containerID="cri-o://7b113b343450d9b94ea6adbfe150b4f215416e1008c8bb123e85d6f538396d6e" gracePeriod=30 Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.046393 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="73de5f3f-982c-4471-b91b-e3725da6be03" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://62c716ef584f99d01d3c8b6a75eeb1c7c51deabb2ccc02590847f8e3c96e8ddb" gracePeriod=30 Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.046451 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"73de5f3f-982c-4471-b91b-e3725da6be03","Type":"ContainerStarted","Data":"62c716ef584f99d01d3c8b6a75eeb1c7c51deabb2ccc02590847f8e3c96e8ddb"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.059982 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c429025c-a79e-425a-987a-773baaba5ef2","Type":"ContainerStarted","Data":"629650e8b16a63777bc8cae654dc487e384a65773313b5678caf2189bce80c69"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.060083 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c429025c-a79e-425a-987a-773baaba5ef2","Type":"ContainerStarted","Data":"714f99461e2735815fecf65f89d7affa2f50ecfc1ef8586cdf0229b12c705175"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.067821 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed","Type":"ContainerStarted","Data":"bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.069803 4836 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=14.583836704 podStartE2EDuration="20.0697675s" podCreationTimestamp="2026-02-17 14:30:47 +0000 UTC" firstStartedPulling="2026-02-17 14:31:00.355879851 +0000 UTC m=+1486.698808120" lastFinishedPulling="2026-02-17 14:31:05.841810647 +0000 UTC m=+1492.184738916" observedRunningTime="2026-02-17 14:31:07.06238526 +0000 UTC m=+1493.405313529" watchObservedRunningTime="2026-02-17 14:31:07.0697675 +0000 UTC m=+1493.412695769" Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.088005 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" event={"ID":"d8b08728-c946-43e4-85fa-0b033034bd26","Type":"ContainerStarted","Data":"6c739c83cd6c60eccf82cdc83958244ae182a579d8b46273f0f3fb2234b691ec"} Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.089745 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.097211 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=15.883125737 podStartE2EDuration="20.097172494s" podCreationTimestamp="2026-02-17 14:30:47 +0000 UTC" firstStartedPulling="2026-02-17 14:31:01.644706779 +0000 UTC m=+1487.987635048" lastFinishedPulling="2026-02-17 14:31:05.858753536 +0000 UTC m=+1492.201681805" observedRunningTime="2026-02-17 14:31:07.090057091 +0000 UTC m=+1493.432985380" watchObservedRunningTime="2026-02-17 14:31:07.097172494 +0000 UTC m=+1493.440100763" Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.116262 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=14.872917693 podStartE2EDuration="19.116232531s" podCreationTimestamp="2026-02-17 14:30:48 +0000 UTC" firstStartedPulling="2026-02-17 14:31:01.598497029 +0000 
UTC m=+1487.941425298" lastFinishedPulling="2026-02-17 14:31:05.841811857 +0000 UTC m=+1492.184740136" observedRunningTime="2026-02-17 14:31:07.114476193 +0000 UTC m=+1493.457404482" watchObservedRunningTime="2026-02-17 14:31:07.116232531 +0000 UTC m=+1493.459160820" Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.156148 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" podStartSLOduration=19.156108682 podStartE2EDuration="19.156108682s" podCreationTimestamp="2026-02-17 14:30:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:07.145112513 +0000 UTC m=+1493.488040802" watchObservedRunningTime="2026-02-17 14:31:07.156108682 +0000 UTC m=+1493.499036951" Feb 17 14:31:07 crc kubenswrapper[4836]: I0217 14:31:07.183688 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=15.200295449 podStartE2EDuration="20.183654309s" podCreationTimestamp="2026-02-17 14:30:47 +0000 UTC" firstStartedPulling="2026-02-17 14:31:00.866970678 +0000 UTC m=+1487.209898947" lastFinishedPulling="2026-02-17 14:31:05.850329528 +0000 UTC m=+1492.193257807" observedRunningTime="2026-02-17 14:31:07.169095564 +0000 UTC m=+1493.512023843" watchObservedRunningTime="2026-02-17 14:31:07.183654309 +0000 UTC m=+1493.526582568" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.113597 4836 generic.go:334] "Generic (PLEG): container finished" podID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerID="376eec79f26ee001240c9935be991e0037ae7235786eba961156abcccae1aa84" exitCode=143 Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.113825 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3d6e757d-b7e9-417b-a63e-94879c7f3f74","Type":"ContainerDied","Data":"376eec79f26ee001240c9935be991e0037ae7235786eba961156abcccae1aa84"} Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.121745 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerStarted","Data":"f38b096e0cb174cc64328d45e0488ca7d9888e5f9a99f36292e400cf1d4fd13d"} Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.321632 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.321741 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.521149 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.521238 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.559125 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.654541 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.654636 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 14:31:08 crc kubenswrapper[4836]: I0217 14:31:08.724163 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:31:09 crc kubenswrapper[4836]: I0217 14:31:09.141327 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerStarted","Data":"b296d87fd7976e77f9e6129f3c2eff1cfb26664d576e803cadb9560994f76f7d"} Feb 17 14:31:09 crc kubenswrapper[4836]: I0217 14:31:09.187972 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 14:31:09 crc kubenswrapper[4836]: I0217 14:31:09.403663 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:09 crc kubenswrapper[4836]: I0217 14:31:09.404154 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:10 crc kubenswrapper[4836]: I0217 14:31:10.157056 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerStarted","Data":"bd01937315b50abe24d9b50b0b169dd1610243cceeeeb3bd7fb5017e3522ad76"} Feb 17 14:31:10 crc kubenswrapper[4836]: I0217 14:31:10.213558 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.425666968 podStartE2EDuration="10.213531733s" podCreationTimestamp="2026-02-17 14:31:00 +0000 UTC" firstStartedPulling="2026-02-17 14:31:02.504714586 +0000 UTC m=+1488.847642865" lastFinishedPulling="2026-02-17 14:31:09.292579361 +0000 UTC m=+1495.635507630" observedRunningTime="2026-02-17 14:31:10.209606746 +0000 UTC m=+1496.552535025" watchObservedRunningTime="2026-02-17 14:31:10.213531733 +0000 UTC m=+1496.556460002" Feb 17 14:31:11 crc kubenswrapper[4836]: 
I0217 14:31:11.177831 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:31:13 crc kubenswrapper[4836]: I0217 14:31:13.202188 4836 generic.go:334] "Generic (PLEG): container finished" podID="790a788c-3cfe-49c8-b1ff-a83bcedf17e0" containerID="9a55578dc34e67ce0a93dbbd5c5e496ed951f38d462ffb4dcccf5ec23897e1c5" exitCode=0 Feb 17 14:31:13 crc kubenswrapper[4836]: I0217 14:31:13.202698 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bz94v" event={"ID":"790a788c-3cfe-49c8-b1ff-a83bcedf17e0","Type":"ContainerDied","Data":"9a55578dc34e67ce0a93dbbd5c5e496ed951f38d462ffb4dcccf5ec23897e1c5"} Feb 17 14:31:13 crc kubenswrapper[4836]: I0217 14:31:13.902600 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" Feb 17 14:31:13 crc kubenswrapper[4836]: I0217 14:31:13.966754 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-cjz8m"] Feb 17 14:31:13 crc kubenswrapper[4836]: I0217 14:31:13.975179 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerName="dnsmasq-dns" containerID="cri-o://407f5678203e5e174c01300835b55b61252a1ab248014426970911ab531d756b" gracePeriod=10 Feb 17 14:31:14 crc kubenswrapper[4836]: I0217 14:31:14.220455 4836 generic.go:334] "Generic (PLEG): container finished" podID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerID="407f5678203e5e174c01300835b55b61252a1ab248014426970911ab531d756b" exitCode=0 Feb 17 14:31:14 crc kubenswrapper[4836]: I0217 14:31:14.220579 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" event={"ID":"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d","Type":"ContainerDied","Data":"407f5678203e5e174c01300835b55b61252a1ab248014426970911ab531d756b"} Feb 17 14:31:14 crc 
kubenswrapper[4836]: I0217 14:31:14.224914 4836 generic.go:334] "Generic (PLEG): container finished" podID="3f9d6a93-3d3a-4c5c-85cf-329209cfe911" containerID="c224cbe49994301a8cf7d7e85623916f9815d0873ee461d723b64e1a3b753f8d" exitCode=0 Feb 17 14:31:14 crc kubenswrapper[4836]: I0217 14:31:14.225183 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lqvvn" event={"ID":"3f9d6a93-3d3a-4c5c-85cf-329209cfe911","Type":"ContainerDied","Data":"c224cbe49994301a8cf7d7e85623916f9815d0873ee461d723b64e1a3b753f8d"} Feb 17 14:31:14 crc kubenswrapper[4836]: I0217 14:31:14.428456 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.190:5353: connect: connection refused" Feb 17 14:31:14 crc kubenswrapper[4836]: E0217 14:31:14.799652 4836 info.go:109] Failed to get network devices: open /sys/class/net/a2bfac90ff95a7b/address: no such file or directory Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.075882 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bz94v" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.215607 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn56p\" (UniqueName: \"kubernetes.io/projected/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-kube-api-access-hn56p\") pod \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.215794 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-config-data\") pod \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.215860 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-combined-ca-bundle\") pod \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.215967 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-scripts\") pod \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\" (UID: \"790a788c-3cfe-49c8-b1ff-a83bcedf17e0\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.223487 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-scripts" (OuterVolumeSpecName: "scripts") pod "790a788c-3cfe-49c8-b1ff-a83bcedf17e0" (UID: "790a788c-3cfe-49c8-b1ff-a83bcedf17e0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.224036 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-kube-api-access-hn56p" (OuterVolumeSpecName: "kube-api-access-hn56p") pod "790a788c-3cfe-49c8-b1ff-a83bcedf17e0" (UID: "790a788c-3cfe-49c8-b1ff-a83bcedf17e0"). InnerVolumeSpecName "kube-api-access-hn56p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.244625 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" event={"ID":"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d","Type":"ContainerDied","Data":"3d5f1259a1d6811a1bf928961a17e403bc60cdb65dc5d67063f562d7b7e44223"} Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.244685 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d5f1259a1d6811a1bf928961a17e403bc60cdb65dc5d67063f562d7b7e44223" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.253560 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bz94v" event={"ID":"790a788c-3cfe-49c8-b1ff-a83bcedf17e0","Type":"ContainerDied","Data":"a2bfac90ff95a7b3434ec97c87c23bac5422abcf87cd41094de043f68aa74460"} Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.253645 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2bfac90ff95a7b3434ec97c87c23bac5422abcf87cd41094de043f68aa74460" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.255475 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bz94v" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.264346 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-config-data" (OuterVolumeSpecName: "config-data") pod "790a788c-3cfe-49c8-b1ff-a83bcedf17e0" (UID: "790a788c-3cfe-49c8-b1ff-a83bcedf17e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.279757 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "790a788c-3cfe-49c8-b1ff-a83bcedf17e0" (UID: "790a788c-3cfe-49c8-b1ff-a83bcedf17e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.326052 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.326091 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.326105 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.326115 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn56p\" (UniqueName: \"kubernetes.io/projected/790a788c-3cfe-49c8-b1ff-a83bcedf17e0-kube-api-access-hn56p\") on node \"crc\" 
DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.351374 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 14:31:15 crc kubenswrapper[4836]: E0217 14:31:15.352059 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="790a788c-3cfe-49c8-b1ff-a83bcedf17e0" containerName="nova-cell1-conductor-db-sync" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.352080 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="790a788c-3cfe-49c8-b1ff-a83bcedf17e0" containerName="nova-cell1-conductor-db-sync" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.352326 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="790a788c-3cfe-49c8-b1ff-a83bcedf17e0" containerName="nova-cell1-conductor-db-sync" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.353253 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.383989 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.409222 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.435554 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed905f2c-85b9-4684-a376-674caf693eca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.435687 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed905f2c-85b9-4684-a376-674caf693eca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.435890 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q92t\" (UniqueName: \"kubernetes.io/projected/ed905f2c-85b9-4684-a376-674caf693eca-kube-api-access-5q92t\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.537282 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tzl2\" (UniqueName: \"kubernetes.io/projected/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-kube-api-access-9tzl2\") pod \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.537397 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-svc\") pod \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " Feb 17 14:31:15 crc 
kubenswrapper[4836]: I0217 14:31:15.537423 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-sb\") pod \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.537544 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-config\") pod \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.537576 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-nb\") pod \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.537771 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-swift-storage-0\") pod \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\" (UID: \"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.538184 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed905f2c-85b9-4684-a376-674caf693eca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.538334 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q92t\" (UniqueName: 
\"kubernetes.io/projected/ed905f2c-85b9-4684-a376-674caf693eca-kube-api-access-5q92t\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.538409 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed905f2c-85b9-4684-a376-674caf693eca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.547549 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed905f2c-85b9-4684-a376-674caf693eca-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.550922 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed905f2c-85b9-4684-a376-674caf693eca-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.573274 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q92t\" (UniqueName: \"kubernetes.io/projected/ed905f2c-85b9-4684-a376-674caf693eca-kube-api-access-5q92t\") pod \"nova-cell1-conductor-0\" (UID: \"ed905f2c-85b9-4684-a376-674caf693eca\") " pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.601499 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-kube-api-access-9tzl2" (OuterVolumeSpecName: "kube-api-access-9tzl2") pod 
"2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" (UID: "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d"). InnerVolumeSpecName "kube-api-access-9tzl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.629942 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" (UID: "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.637508 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" (UID: "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.641490 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-config" (OuterVolumeSpecName: "config") pod "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" (UID: "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.641643 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tzl2\" (UniqueName: \"kubernetes.io/projected/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-kube-api-access-9tzl2\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.641691 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.641700 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.645469 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" (UID: "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.668567 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" (UID: "2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.742983 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.744375 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.744410 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.744424 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.809649 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.966068 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-config-data\") pod \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.966245 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b2gc\" (UniqueName: \"kubernetes.io/projected/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-kube-api-access-5b2gc\") pod \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.966339 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-combined-ca-bundle\") pod \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.966437 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-scripts\") pod \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\" (UID: \"3f9d6a93-3d3a-4c5c-85cf-329209cfe911\") " Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.974544 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-kube-api-access-5b2gc" (OuterVolumeSpecName: "kube-api-access-5b2gc") pod "3f9d6a93-3d3a-4c5c-85cf-329209cfe911" (UID: "3f9d6a93-3d3a-4c5c-85cf-329209cfe911"). InnerVolumeSpecName "kube-api-access-5b2gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:15 crc kubenswrapper[4836]: I0217 14:31:15.981597 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-scripts" (OuterVolumeSpecName: "scripts") pod "3f9d6a93-3d3a-4c5c-85cf-329209cfe911" (UID: "3f9d6a93-3d3a-4c5c-85cf-329209cfe911"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.008257 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-config-data" (OuterVolumeSpecName: "config-data") pod "3f9d6a93-3d3a-4c5c-85cf-329209cfe911" (UID: "3f9d6a93-3d3a-4c5c-85cf-329209cfe911"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.016756 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9d6a93-3d3a-4c5c-85cf-329209cfe911" (UID: "3f9d6a93-3d3a-4c5c-85cf-329209cfe911"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.069736 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.069780 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.069791 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b2gc\" (UniqueName: \"kubernetes.io/projected/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-kube-api-access-5b2gc\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.069804 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9d6a93-3d3a-4c5c-85cf-329209cfe911-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.264086 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 14:31:16 crc kubenswrapper[4836]: W0217 14:31:16.276344 4836 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded905f2c_85b9_4684_a376_674caf693eca.slice/crio-f45852e9108cfa4373def7cb43822ff868e765b991183997db290af1163c4743 WatchSource:0}: Error finding container f45852e9108cfa4373def7cb43822ff868e765b991183997db290af1163c4743: Status 404 returned error can't find the container with id f45852e9108cfa4373def7cb43822ff868e765b991183997db290af1163c4743 Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.278868 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5vl4" event={"ID":"5d52263a-9417-43b6-903c-79e41b1200a0","Type":"ContainerStarted","Data":"d9cc1391f260161a1515210b2ec3643c9f8903ea38174d3e1a0a920c643cd1c4"} Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.289601 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67bdc55879-cjz8m" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.289646 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-lqvvn" event={"ID":"3f9d6a93-3d3a-4c5c-85cf-329209cfe911","Type":"ContainerDied","Data":"2372be5c0fdb7f3175c7e34e83cf6164deb631b4fbf708d784b860e58c9b0a29"} Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.289745 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2372be5c0fdb7f3175c7e34e83cf6164deb631b4fbf708d784b860e58c9b0a29" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.289685 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-lqvvn" Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.462327 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.462749 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-log" containerID="cri-o://714f99461e2735815fecf65f89d7affa2f50ecfc1ef8586cdf0229b12c705175" gracePeriod=30 Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.462819 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-api" containerID="cri-o://629650e8b16a63777bc8cae654dc487e384a65773313b5678caf2189bce80c69" gracePeriod=30 Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.480742 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.482142 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" containerName="nova-scheduler-scheduler" containerID="cri-o://bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" gracePeriod=30 Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.644428 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-cjz8m"] Feb 17 14:31:16 crc kubenswrapper[4836]: I0217 14:31:16.669243 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67bdc55879-cjz8m"] Feb 17 14:31:17 crc kubenswrapper[4836]: I0217 14:31:17.304131 4836 generic.go:334] "Generic (PLEG): container finished" podID="c429025c-a79e-425a-987a-773baaba5ef2" containerID="714f99461e2735815fecf65f89d7affa2f50ecfc1ef8586cdf0229b12c705175" exitCode=143 
Feb 17 14:31:17 crc kubenswrapper[4836]: I0217 14:31:17.304238 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c429025c-a79e-425a-987a-773baaba5ef2","Type":"ContainerDied","Data":"714f99461e2735815fecf65f89d7affa2f50ecfc1ef8586cdf0229b12c705175"} Feb 17 14:31:17 crc kubenswrapper[4836]: I0217 14:31:17.306152 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ed905f2c-85b9-4684-a376-674caf693eca","Type":"ContainerStarted","Data":"8db8d4f89aaf6691469ef88d0e4bf139a6685bf389a2665674af697b0e174704"} Feb 17 14:31:17 crc kubenswrapper[4836]: I0217 14:31:17.306182 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ed905f2c-85b9-4684-a376-674caf693eca","Type":"ContainerStarted","Data":"f45852e9108cfa4373def7cb43822ff868e765b991183997db290af1163c4743"} Feb 17 14:31:17 crc kubenswrapper[4836]: I0217 14:31:17.308981 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:17 crc kubenswrapper[4836]: I0217 14:31:17.334246 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.334220327 podStartE2EDuration="2.334220327s" podCreationTimestamp="2026-02-17 14:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:17.325890552 +0000 UTC m=+1503.668818821" watchObservedRunningTime="2026-02-17 14:31:17.334220327 +0000 UTC m=+1503.677148596" Feb 17 14:31:18 crc kubenswrapper[4836]: I0217 14:31:18.321088 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:31:18 crc kubenswrapper[4836]: I0217 14:31:18.321146 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:31:18 crc 
kubenswrapper[4836]: E0217 14:31:18.530035 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:31:18 crc kubenswrapper[4836]: E0217 14:31:18.537743 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:31:18 crc kubenswrapper[4836]: E0217 14:31:18.539847 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:31:18 crc kubenswrapper[4836]: E0217 14:31:18.540018 4836 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" containerName="nova-scheduler-scheduler" Feb 17 14:31:18 crc kubenswrapper[4836]: I0217 14:31:18.583141 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" path="/var/lib/kubelet/pods/2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d/volumes" Feb 17 14:31:19 crc kubenswrapper[4836]: I0217 14:31:19.333974 4836 generic.go:334] "Generic (PLEG): container finished" podID="5d52263a-9417-43b6-903c-79e41b1200a0" 
containerID="d9cc1391f260161a1515210b2ec3643c9f8903ea38174d3e1a0a920c643cd1c4" exitCode=0 Feb 17 14:31:19 crc kubenswrapper[4836]: I0217 14:31:19.334099 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5vl4" event={"ID":"5d52263a-9417-43b6-903c-79e41b1200a0","Type":"ContainerDied","Data":"d9cc1391f260161a1515210b2ec3643c9f8903ea38174d3e1a0a920c643cd1c4"} Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.350626 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5vl4" event={"ID":"5d52263a-9417-43b6-903c-79e41b1200a0","Type":"ContainerStarted","Data":"981c2da3bead75035e25520c28ec32c6fc29417547882d5a18e37555cbfe229b"} Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.353432 4836 generic.go:334] "Generic (PLEG): container finished" podID="c429025c-a79e-425a-987a-773baaba5ef2" containerID="629650e8b16a63777bc8cae654dc487e384a65773313b5678caf2189bce80c69" exitCode=0 Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.353473 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c429025c-a79e-425a-987a-773baaba5ef2","Type":"ContainerDied","Data":"629650e8b16a63777bc8cae654dc487e384a65773313b5678caf2189bce80c69"} Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.384600 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r5vl4" podStartSLOduration=2.846713621 podStartE2EDuration="45.384572156s" podCreationTimestamp="2026-02-17 14:30:35 +0000 UTC" firstStartedPulling="2026-02-17 14:30:37.329182088 +0000 UTC m=+1463.672110357" lastFinishedPulling="2026-02-17 14:31:19.867040623 +0000 UTC m=+1506.209968892" observedRunningTime="2026-02-17 14:31:20.378123171 +0000 UTC m=+1506.721051440" watchObservedRunningTime="2026-02-17 14:31:20.384572156 +0000 UTC m=+1506.727500425" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.671529 4836 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.809439 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-config-data\") pod \"c429025c-a79e-425a-987a-773baaba5ef2\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.809582 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-combined-ca-bundle\") pod \"c429025c-a79e-425a-987a-773baaba5ef2\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.809724 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n528b\" (UniqueName: \"kubernetes.io/projected/c429025c-a79e-425a-987a-773baaba5ef2-kube-api-access-n528b\") pod \"c429025c-a79e-425a-987a-773baaba5ef2\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.809891 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c429025c-a79e-425a-987a-773baaba5ef2-logs\") pod \"c429025c-a79e-425a-987a-773baaba5ef2\" (UID: \"c429025c-a79e-425a-987a-773baaba5ef2\") " Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.818837 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c429025c-a79e-425a-987a-773baaba5ef2-logs" (OuterVolumeSpecName: "logs") pod "c429025c-a79e-425a-987a-773baaba5ef2" (UID: "c429025c-a79e-425a-987a-773baaba5ef2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.832500 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c429025c-a79e-425a-987a-773baaba5ef2-kube-api-access-n528b" (OuterVolumeSpecName: "kube-api-access-n528b") pod "c429025c-a79e-425a-987a-773baaba5ef2" (UID: "c429025c-a79e-425a-987a-773baaba5ef2"). InnerVolumeSpecName "kube-api-access-n528b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.862983 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-config-data" (OuterVolumeSpecName: "config-data") pod "c429025c-a79e-425a-987a-773baaba5ef2" (UID: "c429025c-a79e-425a-987a-773baaba5ef2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.904745 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c429025c-a79e-425a-987a-773baaba5ef2" (UID: "c429025c-a79e-425a-987a-773baaba5ef2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.925129 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.925215 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c429025c-a79e-425a-987a-773baaba5ef2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.925242 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n528b\" (UniqueName: \"kubernetes.io/projected/c429025c-a79e-425a-987a-773baaba5ef2-kube-api-access-n528b\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.925272 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c429025c-a79e-425a-987a-773baaba5ef2-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:20 crc kubenswrapper[4836]: E0217 14:31:20.954090 4836 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50c442bd_1a4d_4e8f_b3b2_c2e6c97faeed.slice/crio-bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50c442bd_1a4d_4e8f_b3b2_c2e6c97faeed.slice/crio-conmon-bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:31:20 crc kubenswrapper[4836]: I0217 14:31:20.963340 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.128773 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-combined-ca-bundle\") pod \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.129071 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-config-data\") pod \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.129254 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6c45\" (UniqueName: \"kubernetes.io/projected/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-kube-api-access-w6c45\") pod \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\" (UID: \"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed\") " Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.151476 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-kube-api-access-w6c45" (OuterVolumeSpecName: "kube-api-access-w6c45") pod "50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" (UID: "50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed"). InnerVolumeSpecName "kube-api-access-w6c45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.171267 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" (UID: "50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.180764 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-config-data" (OuterVolumeSpecName: "config-data") pod "50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" (UID: "50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.232450 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.232496 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.232509 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6c45\" (UniqueName: \"kubernetes.io/projected/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed-kube-api-access-w6c45\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.369602 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c429025c-a79e-425a-987a-773baaba5ef2","Type":"ContainerDied","Data":"156a656ff70823725df87467a6303f3fbea750b4b0136e5d302f89802c6a5c93"} Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.369646 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.369669 4836 scope.go:117] "RemoveContainer" containerID="629650e8b16a63777bc8cae654dc487e384a65773313b5678caf2189bce80c69" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.372834 4836 generic.go:334] "Generic (PLEG): container finished" podID="50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" containerID="bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" exitCode=0 Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.372873 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed","Type":"ContainerDied","Data":"bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef"} Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.372894 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed","Type":"ContainerDied","Data":"7520be2580517ac140f3d2f437db810e791805bf994d997d530a396b328fa465"} Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.372939 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.418685 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.434982 4836 scope.go:117] "RemoveContainer" containerID="714f99461e2735815fecf65f89d7affa2f50ecfc1ef8586cdf0229b12c705175" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.446176 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.464451 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.482478 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.499342 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: E0217 14:31:21.500134 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerName="init" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500157 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerName="init" Feb 17 14:31:21 crc kubenswrapper[4836]: E0217 14:31:21.500172 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" containerName="nova-scheduler-scheduler" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500179 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" containerName="nova-scheduler-scheduler" Feb 17 14:31:21 crc kubenswrapper[4836]: E0217 14:31:21.500195 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9d6a93-3d3a-4c5c-85cf-329209cfe911" containerName="nova-manage" Feb 17 14:31:21 crc 
kubenswrapper[4836]: I0217 14:31:21.500203 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9d6a93-3d3a-4c5c-85cf-329209cfe911" containerName="nova-manage" Feb 17 14:31:21 crc kubenswrapper[4836]: E0217 14:31:21.500235 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-api" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500246 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-api" Feb 17 14:31:21 crc kubenswrapper[4836]: E0217 14:31:21.500266 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerName="dnsmasq-dns" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500274 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerName="dnsmasq-dns" Feb 17 14:31:21 crc kubenswrapper[4836]: E0217 14:31:21.500305 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-log" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500312 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-log" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500568 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" containerName="nova-scheduler-scheduler" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500591 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-log" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500607 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb84cd5-666c-4cf2-94df-5b51d0e3fa6d" containerName="dnsmasq-dns" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 
14:31:21.500628 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9d6a93-3d3a-4c5c-85cf-329209cfe911" containerName="nova-manage" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.500646 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c429025c-a79e-425a-987a-773baaba5ef2" containerName="nova-api-api" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.501686 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.502896 4836 scope.go:117] "RemoveContainer" containerID="bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.504574 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.519602 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.534789 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.537919 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.545259 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.551624 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.569165 4836 scope.go:117] "RemoveContainer" containerID="bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" Feb 17 14:31:21 crc kubenswrapper[4836]: E0217 14:31:21.569934 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef\": container with ID starting with bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef not found: ID does not exist" containerID="bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.570009 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef"} err="failed to get container status \"bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef\": rpc error: code = NotFound desc = could not find container \"bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef\": container with ID starting with bcf6221fea7b134ba44f91202f344506361b2f3cd1e454aa804b4124188e38ef not found: ID does not exist" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.644757 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrvs\" (UniqueName: \"kubernetes.io/projected/1853ac32-f733-4d5f-9cc2-edf83a927b28-kube-api-access-lbrvs\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" 
Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.644819 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.645175 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.645618 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-config-data\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.645666 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvc2r\" (UniqueName: \"kubernetes.io/projected/cf284e7d-7c68-4688-9e14-87e9c32f6c41-kube-api-access-rvc2r\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.646016 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-config-data\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.646267 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf284e7d-7c68-4688-9e14-87e9c32f6c41-logs\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.748884 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-config-data\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.748941 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvc2r\" (UniqueName: \"kubernetes.io/projected/cf284e7d-7c68-4688-9e14-87e9c32f6c41-kube-api-access-rvc2r\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.749016 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-config-data\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.750334 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf284e7d-7c68-4688-9e14-87e9c32f6c41-logs\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.750783 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf284e7d-7c68-4688-9e14-87e9c32f6c41-logs\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 
crc kubenswrapper[4836]: I0217 14:31:21.751009 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrvs\" (UniqueName: \"kubernetes.io/projected/1853ac32-f733-4d5f-9cc2-edf83a927b28-kube-api-access-lbrvs\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.751637 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.752170 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.753848 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-config-data\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.758626 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.758720 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.760660 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-config-data\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.776531 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvc2r\" (UniqueName: \"kubernetes.io/projected/cf284e7d-7c68-4688-9e14-87e9c32f6c41-kube-api-access-rvc2r\") pod \"nova-api-0\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " pod="openstack/nova-api-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.778078 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrvs\" (UniqueName: \"kubernetes.io/projected/1853ac32-f733-4d5f-9cc2-edf83a927b28-kube-api-access-lbrvs\") pod \"nova-scheduler-0\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.822031 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:31:21 crc kubenswrapper[4836]: I0217 14:31:21.884173 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:31:22 crc kubenswrapper[4836]: I0217 14:31:22.413125 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:31:22 crc kubenswrapper[4836]: W0217 14:31:22.579350 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf284e7d_7c68_4688_9e14_87e9c32f6c41.slice/crio-be3d92134965e9daf67db43dfe412847ed9ee954252aa5228b3fe8a1f26a6f7a WatchSource:0}: Error finding container be3d92134965e9daf67db43dfe412847ed9ee954252aa5228b3fe8a1f26a6f7a: Status 404 returned error can't find the container with id be3d92134965e9daf67db43dfe412847ed9ee954252aa5228b3fe8a1f26a6f7a Feb 17 14:31:22 crc kubenswrapper[4836]: I0217 14:31:22.589180 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed" path="/var/lib/kubelet/pods/50c442bd-1a4d-4e8f-b3b2-c2e6c97faeed/volumes" Feb 17 14:31:22 crc kubenswrapper[4836]: I0217 14:31:22.590046 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c429025c-a79e-425a-987a-773baaba5ef2" path="/var/lib/kubelet/pods/c429025c-a79e-425a-987a-773baaba5ef2/volumes" Feb 17 14:31:22 crc kubenswrapper[4836]: I0217 14:31:22.590784 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:23 crc kubenswrapper[4836]: I0217 14:31:23.400007 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf284e7d-7c68-4688-9e14-87e9c32f6c41","Type":"ContainerStarted","Data":"6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047"} Feb 17 14:31:23 crc kubenswrapper[4836]: I0217 14:31:23.400373 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf284e7d-7c68-4688-9e14-87e9c32f6c41","Type":"ContainerStarted","Data":"d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e"} Feb 17 
14:31:23 crc kubenswrapper[4836]: I0217 14:31:23.400387 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf284e7d-7c68-4688-9e14-87e9c32f6c41","Type":"ContainerStarted","Data":"be3d92134965e9daf67db43dfe412847ed9ee954252aa5228b3fe8a1f26a6f7a"} Feb 17 14:31:23 crc kubenswrapper[4836]: I0217 14:31:23.405733 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1853ac32-f733-4d5f-9cc2-edf83a927b28","Type":"ContainerStarted","Data":"cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1"} Feb 17 14:31:23 crc kubenswrapper[4836]: I0217 14:31:23.405791 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1853ac32-f733-4d5f-9cc2-edf83a927b28","Type":"ContainerStarted","Data":"942beb1e7c7f15eec98870c7e5614fc6da7b9d580327cb6b8b021c40fa96a882"} Feb 17 14:31:23 crc kubenswrapper[4836]: I0217 14:31:23.433218 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.433185308 podStartE2EDuration="2.433185308s" podCreationTimestamp="2026-02-17 14:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:23.424994186 +0000 UTC m=+1509.767922455" watchObservedRunningTime="2026-02-17 14:31:23.433185308 +0000 UTC m=+1509.776113587" Feb 17 14:31:23 crc kubenswrapper[4836]: I0217 14:31:23.450979 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.45094841 podStartE2EDuration="2.45094841s" podCreationTimestamp="2026-02-17 14:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:23.445956785 +0000 UTC m=+1509.788885074" watchObservedRunningTime="2026-02-17 14:31:23.45094841 +0000 UTC 
m=+1509.793876689" Feb 17 14:31:25 crc kubenswrapper[4836]: I0217 14:31:25.775279 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 17 14:31:26 crc kubenswrapper[4836]: I0217 14:31:26.135422 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:31:26 crc kubenswrapper[4836]: I0217 14:31:26.135498 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:31:26 crc kubenswrapper[4836]: I0217 14:31:26.823046 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 14:31:27 crc kubenswrapper[4836]: I0217 14:31:27.193392 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r5vl4" podUID="5d52263a-9417-43b6-903c-79e41b1200a0" containerName="registry-server" probeResult="failure" output=< Feb 17 14:31:27 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:31:27 crc kubenswrapper[4836]: > Feb 17 14:31:29 crc kubenswrapper[4836]: I0217 14:31:29.604399 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7977b0a7-fd9c-4d3c-bc21-fbf9d0e70506" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.210:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:31 crc kubenswrapper[4836]: I0217 14:31:31.773009 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 14:31:31 crc kubenswrapper[4836]: I0217 14:31:31.823014 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 14:31:31 crc kubenswrapper[4836]: I0217 14:31:31.858137 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Feb 17 14:31:31 crc kubenswrapper[4836]: I0217 14:31:31.885714 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:31:31 crc kubenswrapper[4836]: I0217 14:31:31.885779 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:31:32 crc kubenswrapper[4836]: I0217 14:31:32.565364 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 14:31:32 crc kubenswrapper[4836]: I0217 14:31:32.967547 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:32 crc kubenswrapper[4836]: I0217 14:31:32.967543 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:35 crc kubenswrapper[4836]: I0217 14:31:35.897712 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:31:35 crc kubenswrapper[4836]: I0217 14:31:35.898238 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="87197028-3222-4c04-89a7-135997258e0d" containerName="kube-state-metrics" containerID="cri-o://6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9" gracePeriod=30 Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.213556 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:31:36 crc kubenswrapper[4836]: 
I0217 14:31:36.308219 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r5vl4" Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.399420 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5vl4"] Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.475795 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89b2r"] Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.476108 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-89b2r" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="registry-server" containerID="cri-o://b84dd65de54881081222d1401d684becd3ab6f396a5d3ddb1a10e413f4f858e0" gracePeriod=2 Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.588959 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.597537 4836 generic.go:334] "Generic (PLEG): container finished" podID="87197028-3222-4c04-89a7-135997258e0d" containerID="6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9" exitCode=2 Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.598053 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.603532 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"87197028-3222-4c04-89a7-135997258e0d","Type":"ContainerDied","Data":"6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9"} Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.603625 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"87197028-3222-4c04-89a7-135997258e0d","Type":"ContainerDied","Data":"ac1cfd9dcf6c1abc6e025d9d148f792c18f57e036146c98f80e4d81f4745553b"} Feb 17 14:31:36 crc kubenswrapper[4836]: I0217 14:31:36.603647 4836 scope.go:117] "RemoveContainer" containerID="6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9" Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.004663 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnv8r\" (UniqueName: \"kubernetes.io/projected/87197028-3222-4c04-89a7-135997258e0d-kube-api-access-wnv8r\") pod \"87197028-3222-4c04-89a7-135997258e0d\" (UID: \"87197028-3222-4c04-89a7-135997258e0d\") " Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.032280 4836 scope.go:117] "RemoveContainer" containerID="6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9" Feb 17 14:31:37 crc kubenswrapper[4836]: E0217 14:31:37.036810 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9\": container with ID starting with 6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9 not found: ID does not exist" containerID="6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9" Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.036887 4836 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9"} err="failed to get container status \"6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9\": rpc error: code = NotFound desc = could not find container \"6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9\": container with ID starting with 6d20de79ec5b2889e85dc2a14e5a092567cc4ce841befd2fa670b857d176cad9 not found: ID does not exist" Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.083631 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87197028-3222-4c04-89a7-135997258e0d-kube-api-access-wnv8r" (OuterVolumeSpecName: "kube-api-access-wnv8r") pod "87197028-3222-4c04-89a7-135997258e0d" (UID: "87197028-3222-4c04-89a7-135997258e0d"). InnerVolumeSpecName "kube-api-access-wnv8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.112001 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnv8r\" (UniqueName: \"kubernetes.io/projected/87197028-3222-4c04-89a7-135997258e0d-kube-api-access-wnv8r\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.369014 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.389684 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.430305 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 14:31:37 crc kubenswrapper[4836]: E0217 14:31:37.431136 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87197028-3222-4c04-89a7-135997258e0d" containerName="kube-state-metrics" Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.431153 4836 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="87197028-3222-4c04-89a7-135997258e0d" containerName="kube-state-metrics"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.433320 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="87197028-3222-4c04-89a7-135997258e0d" containerName="kube-state-metrics"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.434809 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.439195 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.459689 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.480678 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.536823 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhk2k\" (UniqueName: \"kubernetes.io/projected/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-api-access-qhk2k\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.537249 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.537355 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.537384 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.616648 4836 generic.go:334] "Generic (PLEG): container finished" podID="73de5f3f-982c-4471-b91b-e3725da6be03" containerID="62c716ef584f99d01d3c8b6a75eeb1c7c51deabb2ccc02590847f8e3c96e8ddb" exitCode=137
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.616734 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"73de5f3f-982c-4471-b91b-e3725da6be03","Type":"ContainerDied","Data":"62c716ef584f99d01d3c8b6a75eeb1c7c51deabb2ccc02590847f8e3c96e8ddb"}
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.622007 4836 generic.go:334] "Generic (PLEG): container finished" podID="cc99d806-e359-4577-8a61-1b527af8779f" containerID="b84dd65de54881081222d1401d684becd3ab6f396a5d3ddb1a10e413f4f858e0" exitCode=0
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.622207 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89b2r" event={"ID":"cc99d806-e359-4577-8a61-1b527af8779f","Type":"ContainerDied","Data":"b84dd65de54881081222d1401d684becd3ab6f396a5d3ddb1a10e413f4f858e0"}
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.622262 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89b2r" event={"ID":"cc99d806-e359-4577-8a61-1b527af8779f","Type":"ContainerDied","Data":"7917a2258074c4b89c2b9c207136528b694ef3fdf3891f163bd96f2105f7c9c7"}
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.622277 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7917a2258074c4b89c2b9c207136528b694ef3fdf3891f163bd96f2105f7c9c7"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.629889 4836 generic.go:334] "Generic (PLEG): container finished" podID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerID="7b113b343450d9b94ea6adbfe150b4f215416e1008c8bb123e85d6f538396d6e" exitCode=137
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.630220 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d6e757d-b7e9-417b-a63e-94879c7f3f74","Type":"ContainerDied","Data":"7b113b343450d9b94ea6adbfe150b4f215416e1008c8bb123e85d6f538396d6e"}
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.638740 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhk2k\" (UniqueName: \"kubernetes.io/projected/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-api-access-qhk2k\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.638822 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.641391 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.641468 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.647611 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.653090 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.656741 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8809e181-9f70-4810-97e8-6fc4c9e3561a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.669381 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhk2k\" (UniqueName: \"kubernetes.io/projected/8809e181-9f70-4810-97e8-6fc4c9e3561a-kube-api-access-qhk2k\") pod \"kube-state-metrics-0\" (UID: \"8809e181-9f70-4810-97e8-6fc4c9e3561a\") " pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.702756 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.749477 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-catalog-content\") pod \"cc99d806-e359-4577-8a61-1b527af8779f\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") "
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.749748 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97574\" (UniqueName: \"kubernetes.io/projected/cc99d806-e359-4577-8a61-1b527af8779f-kube-api-access-97574\") pod \"cc99d806-e359-4577-8a61-1b527af8779f\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") "
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.749950 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-utilities\") pod \"cc99d806-e359-4577-8a61-1b527af8779f\" (UID: \"cc99d806-e359-4577-8a61-1b527af8779f\") "
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.762984 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-utilities" (OuterVolumeSpecName: "utilities") pod "cc99d806-e359-4577-8a61-1b527af8779f" (UID: "cc99d806-e359-4577-8a61-1b527af8779f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.770037 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc99d806-e359-4577-8a61-1b527af8779f-kube-api-access-97574" (OuterVolumeSpecName: "kube-api-access-97574") pod "cc99d806-e359-4577-8a61-1b527af8779f" (UID: "cc99d806-e359-4577-8a61-1b527af8779f"). InnerVolumeSpecName "kube-api-access-97574". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.788363 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.872714 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.872771 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97574\" (UniqueName: \"kubernetes.io/projected/cc99d806-e359-4577-8a61-1b527af8779f-kube-api-access-97574\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.918254 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.974868 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-combined-ca-bundle\") pod \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") "
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.975568 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-config-data\") pod \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") "
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.975812 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czlhp\" (UniqueName: \"kubernetes.io/projected/3d6e757d-b7e9-417b-a63e-94879c7f3f74-kube-api-access-czlhp\") pod \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") "
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.975877 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6e757d-b7e9-417b-a63e-94879c7f3f74-logs\") pod \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\" (UID: \"3d6e757d-b7e9-417b-a63e-94879c7f3f74\") "
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.980085 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6e757d-b7e9-417b-a63e-94879c7f3f74-logs" (OuterVolumeSpecName: "logs") pod "3d6e757d-b7e9-417b-a63e-94879c7f3f74" (UID: "3d6e757d-b7e9-417b-a63e-94879c7f3f74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:37 crc kubenswrapper[4836]: I0217 14:31:37.987401 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6e757d-b7e9-417b-a63e-94879c7f3f74-kube-api-access-czlhp" (OuterVolumeSpecName: "kube-api-access-czlhp") pod "3d6e757d-b7e9-417b-a63e-94879c7f3f74" (UID: "3d6e757d-b7e9-417b-a63e-94879c7f3f74"). InnerVolumeSpecName "kube-api-access-czlhp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.006165 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc99d806-e359-4577-8a61-1b527af8779f" (UID: "cc99d806-e359-4577-8a61-1b527af8779f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.009140 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.051711 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-config-data" (OuterVolumeSpecName: "config-data") pod "3d6e757d-b7e9-417b-a63e-94879c7f3f74" (UID: "3d6e757d-b7e9-417b-a63e-94879c7f3f74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.054520 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d6e757d-b7e9-417b-a63e-94879c7f3f74" (UID: "3d6e757d-b7e9-417b-a63e-94879c7f3f74"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.078576 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69kdk\" (UniqueName: \"kubernetes.io/projected/73de5f3f-982c-4471-b91b-e3725da6be03-kube-api-access-69kdk\") pod \"73de5f3f-982c-4471-b91b-e3725da6be03\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") "
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.078748 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-config-data\") pod \"73de5f3f-982c-4471-b91b-e3725da6be03\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") "
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.078918 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-combined-ca-bundle\") pod \"73de5f3f-982c-4471-b91b-e3725da6be03\" (UID: \"73de5f3f-982c-4471-b91b-e3725da6be03\") "
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.080175 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czlhp\" (UniqueName: \"kubernetes.io/projected/3d6e757d-b7e9-417b-a63e-94879c7f3f74-kube-api-access-czlhp\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.080201 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d6e757d-b7e9-417b-a63e-94879c7f3f74-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.080232 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc99d806-e359-4577-8a61-1b527af8779f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.080249 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.080264 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d6e757d-b7e9-417b-a63e-94879c7f3f74-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.085129 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73de5f3f-982c-4471-b91b-e3725da6be03-kube-api-access-69kdk" (OuterVolumeSpecName: "kube-api-access-69kdk") pod "73de5f3f-982c-4471-b91b-e3725da6be03" (UID: "73de5f3f-982c-4471-b91b-e3725da6be03"). InnerVolumeSpecName "kube-api-access-69kdk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.150539 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-config-data" (OuterVolumeSpecName: "config-data") pod "73de5f3f-982c-4471-b91b-e3725da6be03" (UID: "73de5f3f-982c-4471-b91b-e3725da6be03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.150653 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73de5f3f-982c-4471-b91b-e3725da6be03" (UID: "73de5f3f-982c-4471-b91b-e3725da6be03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.181901 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.181938 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73de5f3f-982c-4471-b91b-e3725da6be03-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.181950 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69kdk\" (UniqueName: \"kubernetes.io/projected/73de5f3f-982c-4471-b91b-e3725da6be03-kube-api-access-69kdk\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.425901 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 14:31:38 crc kubenswrapper[4836]: W0217 14:31:38.439480 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8809e181_9f70_4810_97e8_6fc4c9e3561a.slice/crio-da9d483ebf10af68921e181af3aa5a70697e99df3f7ea62fd5c7d1a9301dc8d3 WatchSource:0}: Error finding container da9d483ebf10af68921e181af3aa5a70697e99df3f7ea62fd5c7d1a9301dc8d3: Status 404 returned error can't find the container with id da9d483ebf10af68921e181af3aa5a70697e99df3f7ea62fd5c7d1a9301dc8d3
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.581815 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87197028-3222-4c04-89a7-135997258e0d" path="/var/lib/kubelet/pods/87197028-3222-4c04-89a7-135997258e0d/volumes"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.646548 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8809e181-9f70-4810-97e8-6fc4c9e3561a","Type":"ContainerStarted","Data":"da9d483ebf10af68921e181af3aa5a70697e99df3f7ea62fd5c7d1a9301dc8d3"}
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.649587 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.649578 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d6e757d-b7e9-417b-a63e-94879c7f3f74","Type":"ContainerDied","Data":"b6f643b62e3a190c9d6903fa464d7953ef4da0c0399e9c74ba601860f623c445"}
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.649759 4836 scope.go:117] "RemoveContainer" containerID="7b113b343450d9b94ea6adbfe150b4f215416e1008c8bb123e85d6f538396d6e"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.661635 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89b2r"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.662710 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.663153 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"73de5f3f-982c-4471-b91b-e3725da6be03","Type":"ContainerDied","Data":"73eb695666ad36f06c437207b1c5c555a1d75dfaa1f2995189afe99e230da0da"}
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.713058 4836 scope.go:117] "RemoveContainer" containerID="376eec79f26ee001240c9935be991e0037ae7235786eba961156abcccae1aa84"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.759090 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.773386 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.776786 4836 scope.go:117] "RemoveContainer" containerID="62c716ef584f99d01d3c8b6a75eeb1c7c51deabb2ccc02590847f8e3c96e8ddb"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.791266 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.838731 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.884891 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89b2r"]
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.934089 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:31:38 crc kubenswrapper[4836]: E0217 14:31:38.934925 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="extract-content"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.934949 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="extract-content"
Feb 17 14:31:38 crc kubenswrapper[4836]: E0217 14:31:38.934974 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="registry-server"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.934981 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="registry-server"
Feb 17 14:31:38 crc kubenswrapper[4836]: E0217 14:31:38.934999 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-metadata"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935005 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-metadata"
Feb 17 14:31:38 crc kubenswrapper[4836]: E0217 14:31:38.935019 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="extract-utilities"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935025 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="extract-utilities"
Feb 17 14:31:38 crc kubenswrapper[4836]: E0217 14:31:38.935038 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73de5f3f-982c-4471-b91b-e3725da6be03" containerName="nova-cell1-novncproxy-novncproxy"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935068 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="73de5f3f-982c-4471-b91b-e3725da6be03" containerName="nova-cell1-novncproxy-novncproxy"
Feb 17 14:31:38 crc kubenswrapper[4836]: E0217 14:31:38.935094 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-log"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935100 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-log"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935346 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-metadata"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935360 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc99d806-e359-4577-8a61-1b527af8779f" containerName="registry-server"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935389 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" containerName="nova-metadata-log"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.935407 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="73de5f3f-982c-4471-b91b-e3725da6be03" containerName="nova-cell1-novncproxy-novncproxy"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.938364 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.941706 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.943278 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 17 14:31:38 crc kubenswrapper[4836]: I0217 14:31:38.965543 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-89b2r"]
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.021387 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.023595 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.028312 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.028619 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.028751 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.048379 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.066495 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.070114 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-config-data\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.070357 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnlb4\" (UniqueName: \"kubernetes.io/projected/327aaf35-8278-4f1a-b369-7a40209c0a8e-kube-api-access-mnlb4\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.070429 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.070465 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327aaf35-8278-4f1a-b369-7a40209c0a8e-logs\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.070652 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.174126 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.174533 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.174646 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-config-data\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.174686 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.175145 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf7r6\" (UniqueName: \"kubernetes.io/projected/6d9c8dd5-2ccb-4656-a059-352c03aa923d-kube-api-access-kf7r6\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.175258 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.175437 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnlb4\" (UniqueName: \"kubernetes.io/projected/327aaf35-8278-4f1a-b369-7a40209c0a8e-kube-api-access-mnlb4\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.175575 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.175613 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.175662 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327aaf35-8278-4f1a-b369-7a40209c0a8e-logs\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.183153 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327aaf35-8278-4f1a-b369-7a40209c0a8e-logs\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.186974 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-config-data\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.191739 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.192456 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.201990 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnlb4\" (UniqueName: \"kubernetes.io/projected/327aaf35-8278-4f1a-b369-7a40209c0a8e-kube-api-access-mnlb4\") pod \"nova-metadata-0\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.280401 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.281387 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf7r6\" (UniqueName: \"kubernetes.io/projected/6d9c8dd5-2ccb-4656-a059-352c03aa923d-kube-api-access-kf7r6\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.281477 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.281544 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.281728 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.281886 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.288350 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.637657 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf7r6\" (UniqueName: \"kubernetes.io/projected/6d9c8dd5-2ccb-4656-a059-352c03aa923d-kube-api-access-kf7r6\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.678547 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.858811 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:39 crc kubenswrapper[4836]: I0217 14:31:39.874568 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9c8dd5-2ccb-4656-a059-352c03aa923d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6d9c8dd5-2ccb-4656-a059-352c03aa923d\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.056477 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.438944 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.582552 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6e757d-b7e9-417b-a63e-94879c7f3f74" path="/var/lib/kubelet/pods/3d6e757d-b7e9-417b-a63e-94879c7f3f74/volumes"
Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.583506 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73de5f3f-982c-4471-b91b-e3725da6be03" path="/var/lib/kubelet/pods/73de5f3f-982c-4471-b91b-e3725da6be03/volumes"
Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.584968 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc99d806-e359-4577-8a61-1b527af8779f" path="/var/lib/kubelet/pods/cc99d806-e359-4577-8a61-1b527af8779f/volumes"
Feb 17 14:31:40 crc kubenswrapper[4836]: W0217 14:31:40.669370 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d9c8dd5_2ccb_4656_a059_352c03aa923d.slice/crio-98473541f824241a507b6901f4d765f231afe6dff37931dfef86a14dd772cd3e WatchSource:0}: Error finding container 98473541f824241a507b6901f4d765f231afe6dff37931dfef86a14dd772cd3e: Status 404 returned
error can't find the container with id 98473541f824241a507b6901f4d765f231afe6dff37931dfef86a14dd772cd3e Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.688802 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.933235 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"327aaf35-8278-4f1a-b369-7a40209c0a8e","Type":"ContainerStarted","Data":"e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e"} Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.933344 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"327aaf35-8278-4f1a-b369-7a40209c0a8e","Type":"ContainerStarted","Data":"f0a3643ff133a4988442a790112d5d5fb30bb8fc7d8f8119a27a6cf6da2e8bfc"} Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.937547 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d9c8dd5-2ccb-4656-a059-352c03aa923d","Type":"ContainerStarted","Data":"98473541f824241a507b6901f4d765f231afe6dff37931dfef86a14dd772cd3e"} Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.950617 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8809e181-9f70-4810-97e8-6fc4c9e3561a","Type":"ContainerStarted","Data":"a350324cb5b9c856ef8a34e5ef41d3d953463dc1e3c10a657f4906005906c69b"} Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.951594 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.997262 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.997831 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-central-agent" containerID="cri-o://061b81d1777bbf667556b595b5cc63218a0966b7aa3e7690e90b4b84a2173bed" gracePeriod=30 Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.998147 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="proxy-httpd" containerID="cri-o://bd01937315b50abe24d9b50b0b169dd1610243cceeeeb3bd7fb5017e3522ad76" gracePeriod=30 Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.998219 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-notification-agent" containerID="cri-o://f38b096e0cb174cc64328d45e0488ca7d9888e5f9a99f36292e400cf1d4fd13d" gracePeriod=30 Feb 17 14:31:40 crc kubenswrapper[4836]: I0217 14:31:40.998345 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="sg-core" containerID="cri-o://b296d87fd7976e77f9e6129f3c2eff1cfb26664d576e803cadb9560994f76f7d" gracePeriod=30 Feb 17 14:31:41 crc kubenswrapper[4836]: I0217 14:31:41.001710 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.472179833 podStartE2EDuration="4.00167836s" podCreationTimestamp="2026-02-17 14:31:37 +0000 UTC" firstStartedPulling="2026-02-17 14:31:38.443092136 +0000 UTC m=+1524.786020405" lastFinishedPulling="2026-02-17 14:31:38.972590663 +0000 UTC m=+1525.315518932" observedRunningTime="2026-02-17 14:31:40.977210657 +0000 UTC m=+1527.320138936" watchObservedRunningTime="2026-02-17 14:31:41.00167836 +0000 UTC m=+1527.344606619" Feb 17 14:31:41 crc kubenswrapper[4836]: E0217 14:31:41.605395 4836 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b35111_581a_4e2e_9fae_3e0248674655.slice/crio-conmon-bd01937315b50abe24d9b50b0b169dd1610243cceeeeb3bd7fb5017e3522ad76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68b35111_581a_4e2e_9fae_3e0248674655.slice/crio-bd01937315b50abe24d9b50b0b169dd1610243cceeeeb3bd7fb5017e3522ad76.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:31:41 crc kubenswrapper[4836]: I0217 14:31:41.897970 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 14:31:41 crc kubenswrapper[4836]: I0217 14:31:41.899048 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:31:41 crc kubenswrapper[4836]: I0217 14:31:41.904710 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 14:31:41 crc kubenswrapper[4836]: I0217 14:31:41.909685 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.031493 4836 generic.go:334] "Generic (PLEG): container finished" podID="68b35111-581a-4e2e-9fae-3e0248674655" containerID="bd01937315b50abe24d9b50b0b169dd1610243cceeeeb3bd7fb5017e3522ad76" exitCode=0 Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.031535 4836 generic.go:334] "Generic (PLEG): container finished" podID="68b35111-581a-4e2e-9fae-3e0248674655" containerID="b296d87fd7976e77f9e6129f3c2eff1cfb26664d576e803cadb9560994f76f7d" exitCode=2 Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.031542 4836 generic.go:334] "Generic (PLEG): container finished" podID="68b35111-581a-4e2e-9fae-3e0248674655" containerID="061b81d1777bbf667556b595b5cc63218a0966b7aa3e7690e90b4b84a2173bed" exitCode=0 Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.031589 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerDied","Data":"bd01937315b50abe24d9b50b0b169dd1610243cceeeeb3bd7fb5017e3522ad76"} Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.031622 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerDied","Data":"b296d87fd7976e77f9e6129f3c2eff1cfb26664d576e803cadb9560994f76f7d"} Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.031632 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerDied","Data":"061b81d1777bbf667556b595b5cc63218a0966b7aa3e7690e90b4b84a2173bed"} Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.035525 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6d9c8dd5-2ccb-4656-a059-352c03aa923d","Type":"ContainerStarted","Data":"ac9a5d0baf9704bc40414219f5c0c2559f1d8f4e79d885d3018faf83fc960618"} Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.041030 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"327aaf35-8278-4f1a-b369-7a40209c0a8e","Type":"ContainerStarted","Data":"a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3"} Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.041072 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.075177 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.085178 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.085150738 podStartE2EDuration="4.085150738s" 
podCreationTimestamp="2026-02-17 14:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:42.058058524 +0000 UTC m=+1528.400986793" watchObservedRunningTime="2026-02-17 14:31:42.085150738 +0000 UTC m=+1528.428079007" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.111188 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.111158453 podStartE2EDuration="4.111158453s" podCreationTimestamp="2026-02-17 14:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:42.090152034 +0000 UTC m=+1528.433080303" watchObservedRunningTime="2026-02-17 14:31:42.111158453 +0000 UTC m=+1528.454086722" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.348183 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-snjhj"] Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.351288 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.377023 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-snjhj"] Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.527076 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.527194 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.527235 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.527316 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.527375 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-config\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.527399 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7bl\" (UniqueName: \"kubernetes.io/projected/6dc084a0-be89-4371-92a3-181cfe1979ce-kube-api-access-hz7bl\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.629878 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.629985 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.630022 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.630089 4836 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.630137 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-config\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.630163 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz7bl\" (UniqueName: \"kubernetes.io/projected/6dc084a0-be89-4371-92a3-181cfe1979ce-kube-api-access-hz7bl\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.631020 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-ovsdbserver-sb\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.631267 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-dns-svc\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.631673 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-ovsdbserver-nb\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.631884 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-config\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.632287 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6dc084a0-be89-4371-92a3-181cfe1979ce-dns-swift-storage-0\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.655112 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7bl\" (UniqueName: \"kubernetes.io/projected/6dc084a0-be89-4371-92a3-181cfe1979ce-kube-api-access-hz7bl\") pod \"dnsmasq-dns-5fd9b586ff-snjhj\" (UID: \"6dc084a0-be89-4371-92a3-181cfe1979ce\") " pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:42 crc kubenswrapper[4836]: I0217 14:31:42.694758 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:43 crc kubenswrapper[4836]: I0217 14:31:43.419430 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fd9b586ff-snjhj"] Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.069326 4836 generic.go:334] "Generic (PLEG): container finished" podID="68b35111-581a-4e2e-9fae-3e0248674655" containerID="f38b096e0cb174cc64328d45e0488ca7d9888e5f9a99f36292e400cf1d4fd13d" exitCode=0 Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.069725 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerDied","Data":"f38b096e0cb174cc64328d45e0488ca7d9888e5f9a99f36292e400cf1d4fd13d"} Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.071881 4836 generic.go:334] "Generic (PLEG): container finished" podID="6dc084a0-be89-4371-92a3-181cfe1979ce" containerID="390fa4ca6b8979533f6405e113ef2079e208fe0693fc6d17b0be00a546b8f4a6" exitCode=0 Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.072468 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" event={"ID":"6dc084a0-be89-4371-92a3-181cfe1979ce","Type":"ContainerDied","Data":"390fa4ca6b8979533f6405e113ef2079e208fe0693fc6d17b0be00a546b8f4a6"} Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.072571 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" event={"ID":"6dc084a0-be89-4371-92a3-181cfe1979ce","Type":"ContainerStarted","Data":"afc5626684a403ed544c1a7eb27331441a69058c5e8e4168ae08e5ba526f6680"} Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.283003 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.283550 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" 
Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.367949 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.493797 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qclhz\" (UniqueName: \"kubernetes.io/projected/68b35111-581a-4e2e-9fae-3e0248674655-kube-api-access-qclhz\") pod \"68b35111-581a-4e2e-9fae-3e0248674655\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.493885 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-run-httpd\") pod \"68b35111-581a-4e2e-9fae-3e0248674655\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.493910 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-log-httpd\") pod \"68b35111-581a-4e2e-9fae-3e0248674655\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.493971 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-scripts\") pod \"68b35111-581a-4e2e-9fae-3e0248674655\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.494347 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-config-data\") pod \"68b35111-581a-4e2e-9fae-3e0248674655\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.494613 4836 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-combined-ca-bundle\") pod \"68b35111-581a-4e2e-9fae-3e0248674655\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.494765 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-sg-core-conf-yaml\") pod \"68b35111-581a-4e2e-9fae-3e0248674655\" (UID: \"68b35111-581a-4e2e-9fae-3e0248674655\") " Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.495250 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68b35111-581a-4e2e-9fae-3e0248674655" (UID: "68b35111-581a-4e2e-9fae-3e0248674655"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.495616 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68b35111-581a-4e2e-9fae-3e0248674655" (UID: "68b35111-581a-4e2e-9fae-3e0248674655"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.496461 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.496495 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68b35111-581a-4e2e-9fae-3e0248674655-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.501544 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-scripts" (OuterVolumeSpecName: "scripts") pod "68b35111-581a-4e2e-9fae-3e0248674655" (UID: "68b35111-581a-4e2e-9fae-3e0248674655"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.517526 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b35111-581a-4e2e-9fae-3e0248674655-kube-api-access-qclhz" (OuterVolumeSpecName: "kube-api-access-qclhz") pod "68b35111-581a-4e2e-9fae-3e0248674655" (UID: "68b35111-581a-4e2e-9fae-3e0248674655"). InnerVolumeSpecName "kube-api-access-qclhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.576266 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "68b35111-581a-4e2e-9fae-3e0248674655" (UID: "68b35111-581a-4e2e-9fae-3e0248674655"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.598573 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.598731 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qclhz\" (UniqueName: \"kubernetes.io/projected/68b35111-581a-4e2e-9fae-3e0248674655-kube-api-access-qclhz\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.598794 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.675582 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68b35111-581a-4e2e-9fae-3e0248674655" (UID: "68b35111-581a-4e2e-9fae-3e0248674655"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.702344 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.706446 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-config-data" (OuterVolumeSpecName: "config-data") pod "68b35111-581a-4e2e-9fae-3e0248674655" (UID: "68b35111-581a-4e2e-9fae-3e0248674655"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:44 crc kubenswrapper[4836]: I0217 14:31:44.804650 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68b35111-581a-4e2e-9fae-3e0248674655-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.057013 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.085102 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" event={"ID":"6dc084a0-be89-4371-92a3-181cfe1979ce","Type":"ContainerStarted","Data":"3d8cdfd5c39de98f01d9d2493ccdc3254e68c41faca9b0db871f1b8eb0b67eed"} Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.086566 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.089188 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68b35111-581a-4e2e-9fae-3e0248674655","Type":"ContainerDied","Data":"ae9ad801c19207ea76afd61356edd8b9b7b66ec00cfa25e818baaceb869c4ad6"} Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.089257 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.089263 4836 scope.go:117] "RemoveContainer" containerID="bd01937315b50abe24d9b50b0b169dd1610243cceeeeb3bd7fb5017e3522ad76"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.119104 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" podStartSLOduration=3.119083422 podStartE2EDuration="3.119083422s" podCreationTimestamp="2026-02-17 14:31:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:45.115878585 +0000 UTC m=+1531.458806874" watchObservedRunningTime="2026-02-17 14:31:45.119083422 +0000 UTC m=+1531.462011691"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.134454 4836 scope.go:117] "RemoveContainer" containerID="b296d87fd7976e77f9e6129f3c2eff1cfb26664d576e803cadb9560994f76f7d"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.157374 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.165570 4836 scope.go:117] "RemoveContainer" containerID="f38b096e0cb174cc64328d45e0488ca7d9888e5f9a99f36292e400cf1d4fd13d"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.177741 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.209389 4836 scope.go:117] "RemoveContainer" containerID="061b81d1777bbf667556b595b5cc63218a0966b7aa3e7690e90b4b84a2173bed"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.231323 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:45 crc kubenswrapper[4836]: E0217 14:31:45.231947 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-central-agent"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.231968 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-central-agent"
Feb 17 14:31:45 crc kubenswrapper[4836]: E0217 14:31:45.231993 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="sg-core"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.232000 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="sg-core"
Feb 17 14:31:45 crc kubenswrapper[4836]: E0217 14:31:45.232026 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-notification-agent"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.232045 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-notification-agent"
Feb 17 14:31:45 crc kubenswrapper[4836]: E0217 14:31:45.232056 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="proxy-httpd"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.232063 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="proxy-httpd"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.232344 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-central-agent"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.232367 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="ceilometer-notification-agent"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.232388 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="sg-core"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.232405 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b35111-581a-4e2e-9fae-3e0248674655" containerName="proxy-httpd"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.234860 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.237711 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.238031 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.238026 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.335731 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.358728 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.359124 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-log" containerID="cri-o://d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e" gracePeriod=30
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.359379 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-api" containerID="cri-o://6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047" gracePeriod=30
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.391580 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:45 crc kubenswrapper[4836]: E0217 14:31:45.392984 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-6ps4p log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="7f817058-dcec-4186-bb5e-213ab8d215c0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.419252 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-scripts\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.419420 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.419497 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.419538 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.419612 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ps4p\" (UniqueName: \"kubernetes.io/projected/7f817058-dcec-4186-bb5e-213ab8d215c0-kube-api-access-6ps4p\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.420033 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-config-data\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.420113 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.420202 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522373 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-config-data\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522436 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522501 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522540 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-scripts\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522572 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522610 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522633 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.522693 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ps4p\" (UniqueName: \"kubernetes.io/projected/7f817058-dcec-4186-bb5e-213ab8d215c0-kube-api-access-6ps4p\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.523761 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-log-httpd\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.524715 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-run-httpd\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.527411 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-scripts\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.528148 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.528732 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-config-data\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.529088 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.530689 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:45 crc kubenswrapper[4836]: I0217 14:31:45.542418 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ps4p\" (UniqueName: \"kubernetes.io/projected/7f817058-dcec-4186-bb5e-213ab8d215c0-kube-api-access-6ps4p\") pod \"ceilometer-0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") " pod="openstack/ceilometer-0"
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.102350 4836 generic.go:334] "Generic (PLEG): container finished" podID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerID="d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e" exitCode=143
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.102447 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf284e7d-7c68-4688-9e14-87e9c32f6c41","Type":"ContainerDied","Data":"d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e"}
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.103889 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.118591 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.237463 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-run-httpd\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.237669 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-sg-core-conf-yaml\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.237795 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-config-data\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.237864 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-combined-ca-bundle\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.237857 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.237895 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ps4p\" (UniqueName: \"kubernetes.io/projected/7f817058-dcec-4186-bb5e-213ab8d215c0-kube-api-access-6ps4p\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.237953 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-ceilometer-tls-certs\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.238014 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-log-httpd\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.238048 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-scripts\") pod \"7f817058-dcec-4186-bb5e-213ab8d215c0\" (UID: \"7f817058-dcec-4186-bb5e-213ab8d215c0\") "
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.238770 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.239015 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.247505 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.247491 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-scripts" (OuterVolumeSpecName: "scripts") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.247550 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-config-data" (OuterVolumeSpecName: "config-data") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.248316 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.249858 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f817058-dcec-4186-bb5e-213ab8d215c0-kube-api-access-6ps4p" (OuterVolumeSpecName: "kube-api-access-6ps4p") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "kube-api-access-6ps4p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.251717 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7f817058-dcec-4186-bb5e-213ab8d215c0" (UID: "7f817058-dcec-4186-bb5e-213ab8d215c0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.340979 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.341482 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.341496 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.341512 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ps4p\" (UniqueName: \"kubernetes.io/projected/7f817058-dcec-4186-bb5e-213ab8d215c0-kube-api-access-6ps4p\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.341528 4836 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.341542 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7f817058-dcec-4186-bb5e-213ab8d215c0-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.341551 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f817058-dcec-4186-bb5e-213ab8d215c0-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 14:31:46 crc kubenswrapper[4836]: I0217 14:31:46.582242 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68b35111-581a-4e2e-9fae-3e0248674655" path="/var/lib/kubelet/pods/68b35111-581a-4e2e-9fae-3e0248674655/volumes"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.113212 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.182199 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.203502 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.227106 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.230219 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.239583 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.239965 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.240107 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.241620 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.376010 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-run-httpd\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.376498 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-scripts\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.376609 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wvnc\" (UniqueName: \"kubernetes.io/projected/143c175f-4768-4188-8f12-3f76bf70804f-kube-api-access-5wvnc\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.376704 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.376903 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.376960 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-config-data\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.376990 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-log-httpd\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.377169 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.479817 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.479929 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-run-httpd\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480224 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-scripts\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480331 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wvnc\" (UniqueName: \"kubernetes.io/projected/143c175f-4768-4188-8f12-3f76bf70804f-kube-api-access-5wvnc\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480370 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480516 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480558 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-config-data\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480575 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-run-httpd\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480589 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-log-httpd\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.480987 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-log-httpd\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.487261 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.487425 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.487559 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-config-data\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.488382 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-scripts\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.488659 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.504414 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wvnc\" (UniqueName: \"kubernetes.io/projected/143c175f-4768-4188-8f12-3f76bf70804f-kube-api-access-5wvnc\") pod \"ceilometer-0\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.560842 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 14:31:47 crc kubenswrapper[4836]: I0217 14:31:47.809225 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 17 14:31:48 crc kubenswrapper[4836]: I0217 14:31:48.084274 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:48 crc kubenswrapper[4836]: W0217 14:31:48.088181 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod143c175f_4768_4188_8f12_3f76bf70804f.slice/crio-9dbb838e04f66687c191dcad00147af5a9d0fa9da289d8afa2cdcb370fe12258 WatchSource:0}: Error finding container 9dbb838e04f66687c191dcad00147af5a9d0fa9da289d8afa2cdcb370fe12258: Status 404 returned error can't find the container with id 9dbb838e04f66687c191dcad00147af5a9d0fa9da289d8afa2cdcb370fe12258
Feb 17 14:31:48 crc kubenswrapper[4836]: I0217 14:31:48.143849 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerStarted","Data":"9dbb838e04f66687c191dcad00147af5a9d0fa9da289d8afa2cdcb370fe12258"}
Feb 17 14:31:48 crc kubenswrapper[4836]: I0217 14:31:48.253981 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 14:31:48 crc kubenswrapper[4836]: I0217 14:31:48.588227 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f817058-dcec-4186-bb5e-213ab8d215c0" path="/var/lib/kubelet/pods/7f817058-dcec-4186-bb5e-213ab8d215c0/volumes"
Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.118628 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.207724 4836 generic.go:334] "Generic (PLEG): container finished" podID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerID="6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047" exitCode=0
Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.207790 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf284e7d-7c68-4688-9e14-87e9c32f6c41","Type":"ContainerDied","Data":"6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047"}
Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.207825 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cf284e7d-7c68-4688-9e14-87e9c32f6c41","Type":"ContainerDied","Data":"be3d92134965e9daf67db43dfe412847ed9ee954252aa5228b3fe8a1f26a6f7a"}
Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.207868 4836 scope.go:117] "RemoveContainer" containerID="6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047"
Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.208088 4836 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.243325 4836 scope.go:117] "RemoveContainer" containerID="d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.243632 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-combined-ca-bundle\") pod \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.243673 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf284e7d-7c68-4688-9e14-87e9c32f6c41-logs\") pod \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.243710 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvc2r\" (UniqueName: \"kubernetes.io/projected/cf284e7d-7c68-4688-9e14-87e9c32f6c41-kube-api-access-rvc2r\") pod \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.243880 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-config-data\") pod \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\" (UID: \"cf284e7d-7c68-4688-9e14-87e9c32f6c41\") " Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.245427 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf284e7d-7c68-4688-9e14-87e9c32f6c41-logs" (OuterVolumeSpecName: "logs") pod "cf284e7d-7c68-4688-9e14-87e9c32f6c41" (UID: "cf284e7d-7c68-4688-9e14-87e9c32f6c41"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.247268 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf284e7d-7c68-4688-9e14-87e9c32f6c41-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.250807 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf284e7d-7c68-4688-9e14-87e9c32f6c41-kube-api-access-rvc2r" (OuterVolumeSpecName: "kube-api-access-rvc2r") pod "cf284e7d-7c68-4688-9e14-87e9c32f6c41" (UID: "cf284e7d-7c68-4688-9e14-87e9c32f6c41"). InnerVolumeSpecName "kube-api-access-rvc2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.281355 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.281414 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.287799 4836 scope.go:117] "RemoveContainer" containerID="6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047" Feb 17 14:31:49 crc kubenswrapper[4836]: E0217 14:31:49.289099 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047\": container with ID starting with 6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047 not found: ID does not exist" containerID="6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.289153 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047"} err="failed to 
get container status \"6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047\": rpc error: code = NotFound desc = could not find container \"6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047\": container with ID starting with 6e5e87a2800a9b226a2e0132c7153060336b7daacac50613672b14b4cb72f047 not found: ID does not exist" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.289179 4836 scope.go:117] "RemoveContainer" containerID="d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e" Feb 17 14:31:49 crc kubenswrapper[4836]: E0217 14:31:49.289689 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e\": container with ID starting with d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e not found: ID does not exist" containerID="d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.289717 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e"} err="failed to get container status \"d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e\": rpc error: code = NotFound desc = could not find container \"d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e\": container with ID starting with d2a16e7e8ec6c9424d109c9048a972b614d91e9e7b39aa49c9b987a971f99b2e not found: ID does not exist" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.315044 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-config-data" (OuterVolumeSpecName: "config-data") pod "cf284e7d-7c68-4688-9e14-87e9c32f6c41" (UID: "cf284e7d-7c68-4688-9e14-87e9c32f6c41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.322482 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf284e7d-7c68-4688-9e14-87e9c32f6c41" (UID: "cf284e7d-7c68-4688-9e14-87e9c32f6c41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.350442 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.350538 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvc2r\" (UniqueName: \"kubernetes.io/projected/cf284e7d-7c68-4688-9e14-87e9c32f6c41-kube-api-access-rvc2r\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.350552 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf284e7d-7c68-4688-9e14-87e9c32f6c41-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.569370 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.596382 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.653124 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:49 crc kubenswrapper[4836]: E0217 14:31:49.653787 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-log" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 
14:31:49.653843 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-log" Feb 17 14:31:49 crc kubenswrapper[4836]: E0217 14:31:49.653885 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-api" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.653891 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-api" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.654130 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-api" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.654157 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" containerName="nova-api-log" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.657035 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.659463 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.660213 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.660631 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.667459 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.778233 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95da3ac-7563-49bf-a956-b19297cb7d97-logs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.778364 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.778546 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6vzb\" (UniqueName: \"kubernetes.io/projected/c95da3ac-7563-49bf-a956-b19297cb7d97-kube-api-access-m6vzb\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.778835 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.779074 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-config-data\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.779244 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-public-tls-certs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.881260 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-config-data\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.881358 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-public-tls-certs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.881416 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95da3ac-7563-49bf-a956-b19297cb7d97-logs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: 
I0217 14:31:49.881436 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.881474 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vzb\" (UniqueName: \"kubernetes.io/projected/c95da3ac-7563-49bf-a956-b19297cb7d97-kube-api-access-m6vzb\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.881585 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.882705 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95da3ac-7563-49bf-a956-b19297cb7d97-logs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.892534 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.892633 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-config-data\") pod \"nova-api-0\" (UID: 
\"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.892913 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.894700 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-public-tls-certs\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.901939 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vzb\" (UniqueName: \"kubernetes.io/projected/c95da3ac-7563-49bf-a956-b19297cb7d97-kube-api-access-m6vzb\") pod \"nova-api-0\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " pod="openstack/nova-api-0" Feb 17 14:31:49 crc kubenswrapper[4836]: I0217 14:31:49.992077 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.057994 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.087990 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.293222 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerStarted","Data":"c5153296780e29125ef218661157bfd30ddc0ca224b65cb9ffd34fbe9f884fe5"} Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.302011 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.302376 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.331658 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.582553 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf284e7d-7c68-4688-9e14-87e9c32f6c41" path="/var/lib/kubelet/pods/cf284e7d-7c68-4688-9e14-87e9c32f6c41/volumes" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.586553 4836 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-cell-mapping-h4mlr"] Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.588709 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.591585 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.592101 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.613445 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-h4mlr"] Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.699136 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.702185 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-config-data\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.703001 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.703416 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-scripts\") pod 
\"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.703580 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49k2\" (UniqueName: \"kubernetes.io/projected/079f20c9-f742-4c4b-a8c0-a2a09573bf62-kube-api-access-w49k2\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.807887 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-scripts\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.807949 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w49k2\" (UniqueName: \"kubernetes.io/projected/079f20c9-f742-4c4b-a8c0-a2a09573bf62-kube-api-access-w49k2\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.808114 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-config-data\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.808155 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-combined-ca-bundle\") pod 
\"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.815090 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-config-data\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.826728 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-scripts\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.832764 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:50 crc kubenswrapper[4836]: I0217 14:31:50.849121 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49k2\" (UniqueName: \"kubernetes.io/projected/079f20c9-f742-4c4b-a8c0-a2a09573bf62-kube-api-access-w49k2\") pod \"nova-cell1-cell-mapping-h4mlr\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:51 crc kubenswrapper[4836]: I0217 14:31:51.033172 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:31:51 crc kubenswrapper[4836]: I0217 14:31:51.320131 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c95da3ac-7563-49bf-a956-b19297cb7d97","Type":"ContainerStarted","Data":"423c24607bf9f3ae8ddccd308e2f900af69eca84ed6108aea143a2d2240bf369"} Feb 17 14:31:51 crc kubenswrapper[4836]: I0217 14:31:51.320667 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c95da3ac-7563-49bf-a956-b19297cb7d97","Type":"ContainerStarted","Data":"f3759853b325832fd4c44c00a3e3389086733322b67a29ca459e1ee11b19dfb7"} Feb 17 14:31:51 crc kubenswrapper[4836]: I0217 14:31:51.352332 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerStarted","Data":"b5230a56f72b92f59f1479439851803ed41a8e0044ef00c9559a2e9a8714d70a"} Feb 17 14:31:51 crc kubenswrapper[4836]: I0217 14:31:51.352394 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerStarted","Data":"646a0ce4d819762c0c29a5b108e6c0da6c863c98cc2e8398dd5c431ab5025ec7"} Feb 17 14:31:51 crc kubenswrapper[4836]: I0217 14:31:51.699929 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-h4mlr"] Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.372130 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c95da3ac-7563-49bf-a956-b19297cb7d97","Type":"ContainerStarted","Data":"506f661fc5a6b974ebc362273c2cdd8a27145169f4fc426e732aad89c2e734ed"} Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.376620 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h4mlr" 
event={"ID":"079f20c9-f742-4c4b-a8c0-a2a09573bf62","Type":"ContainerStarted","Data":"6d24e9f78b938b24616765924395f09dc01b17f432bd2a5ca96dd30f763b95e2"} Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.376674 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h4mlr" event={"ID":"079f20c9-f742-4c4b-a8c0-a2a09573bf62","Type":"ContainerStarted","Data":"04c119ecda97e9ecffd6aff4094f5b50acfbc974345595ef1ecf805ee73c0e65"} Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.403242 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.403202288 podStartE2EDuration="3.403202288s" podCreationTimestamp="2026-02-17 14:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:52.39440157 +0000 UTC m=+1538.737329859" watchObservedRunningTime="2026-02-17 14:31:52.403202288 +0000 UTC m=+1538.746130557" Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.427260 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-h4mlr" podStartSLOduration=2.427234039 podStartE2EDuration="2.427234039s" podCreationTimestamp="2026-02-17 14:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:31:52.415279606 +0000 UTC m=+1538.758207885" watchObservedRunningTime="2026-02-17 14:31:52.427234039 +0000 UTC m=+1538.770162308" Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.697607 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fd9b586ff-snjhj" Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.772981 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-cbrcp"] Feb 17 14:31:52 crc kubenswrapper[4836]: I0217 14:31:52.773350 
4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" podUID="d8b08728-c946-43e4-85fa-0b033034bd26" containerName="dnsmasq-dns" containerID="cri-o://6c739c83cd6c60eccf82cdc83958244ae182a579d8b46273f0f3fb2234b691ec" gracePeriod=10 Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.403281 4836 generic.go:334] "Generic (PLEG): container finished" podID="d8b08728-c946-43e4-85fa-0b033034bd26" containerID="6c739c83cd6c60eccf82cdc83958244ae182a579d8b46273f0f3fb2234b691ec" exitCode=0 Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.403347 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" event={"ID":"d8b08728-c946-43e4-85fa-0b033034bd26","Type":"ContainerDied","Data":"6c739c83cd6c60eccf82cdc83958244ae182a579d8b46273f0f3fb2234b691ec"} Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.419565 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerStarted","Data":"a50d34b1dd358d7c95a4c22b41eaa637c5a3469ea425b6fa6f7b63908874bad7"} Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.420348 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-central-agent" containerID="cri-o://c5153296780e29125ef218661157bfd30ddc0ca224b65cb9ffd34fbe9f884fe5" gracePeriod=30 Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.421014 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="proxy-httpd" containerID="cri-o://a50d34b1dd358d7c95a4c22b41eaa637c5a3469ea425b6fa6f7b63908874bad7" gracePeriod=30 Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.421088 4836 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="sg-core" containerID="cri-o://b5230a56f72b92f59f1479439851803ed41a8e0044ef00c9559a2e9a8714d70a" gracePeriod=30 Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.421141 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-notification-agent" containerID="cri-o://646a0ce4d819762c0c29a5b108e6c0da6c863c98cc2e8398dd5c431ab5025ec7" gracePeriod=30 Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.454634 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.27692359 podStartE2EDuration="6.454606716s" podCreationTimestamp="2026-02-17 14:31:47 +0000 UTC" firstStartedPulling="2026-02-17 14:31:48.096524374 +0000 UTC m=+1534.439452643" lastFinishedPulling="2026-02-17 14:31:52.27420751 +0000 UTC m=+1538.617135769" observedRunningTime="2026-02-17 14:31:53.451984255 +0000 UTC m=+1539.794912534" watchObservedRunningTime="2026-02-17 14:31:53.454606716 +0000 UTC m=+1539.797534985" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.715134 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.846397 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-sb\") pod \"d8b08728-c946-43e4-85fa-0b033034bd26\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.846498 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-swift-storage-0\") pod \"d8b08728-c946-43e4-85fa-0b033034bd26\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.846808 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-nb\") pod \"d8b08728-c946-43e4-85fa-0b033034bd26\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.846891 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-svc\") pod \"d8b08728-c946-43e4-85fa-0b033034bd26\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.846937 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl67m\" (UniqueName: \"kubernetes.io/projected/d8b08728-c946-43e4-85fa-0b033034bd26-kube-api-access-rl67m\") pod \"d8b08728-c946-43e4-85fa-0b033034bd26\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.847339 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-config\") pod \"d8b08728-c946-43e4-85fa-0b033034bd26\" (UID: \"d8b08728-c946-43e4-85fa-0b033034bd26\") " Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.877541 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8b08728-c946-43e4-85fa-0b033034bd26-kube-api-access-rl67m" (OuterVolumeSpecName: "kube-api-access-rl67m") pod "d8b08728-c946-43e4-85fa-0b033034bd26" (UID: "d8b08728-c946-43e4-85fa-0b033034bd26"). InnerVolumeSpecName "kube-api-access-rl67m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.918761 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8b08728-c946-43e4-85fa-0b033034bd26" (UID: "d8b08728-c946-43e4-85fa-0b033034bd26"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.925587 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8b08728-c946-43e4-85fa-0b033034bd26" (UID: "d8b08728-c946-43e4-85fa-0b033034bd26"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.941017 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-config" (OuterVolumeSpecName: "config") pod "d8b08728-c946-43e4-85fa-0b033034bd26" (UID: "d8b08728-c946-43e4-85fa-0b033034bd26"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.942273 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8b08728-c946-43e4-85fa-0b033034bd26" (UID: "d8b08728-c946-43e4-85fa-0b033034bd26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.957403 4836 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-config\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.957461 4836 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.957474 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.957487 4836 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.957498 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl67m\" (UniqueName: \"kubernetes.io/projected/d8b08728-c946-43e4-85fa-0b033034bd26-kube-api-access-rl67m\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:53 crc kubenswrapper[4836]: I0217 14:31:53.963735 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8b08728-c946-43e4-85fa-0b033034bd26" (UID: "d8b08728-c946-43e4-85fa-0b033034bd26"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.062994 4836 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8b08728-c946-43e4-85fa-0b033034bd26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.435997 4836 generic.go:334] "Generic (PLEG): container finished" podID="143c175f-4768-4188-8f12-3f76bf70804f" containerID="a50d34b1dd358d7c95a4c22b41eaa637c5a3469ea425b6fa6f7b63908874bad7" exitCode=0 Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.436054 4836 generic.go:334] "Generic (PLEG): container finished" podID="143c175f-4768-4188-8f12-3f76bf70804f" containerID="b5230a56f72b92f59f1479439851803ed41a8e0044ef00c9559a2e9a8714d70a" exitCode=2 Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.436068 4836 generic.go:334] "Generic (PLEG): container finished" podID="143c175f-4768-4188-8f12-3f76bf70804f" containerID="646a0ce4d819762c0c29a5b108e6c0da6c863c98cc2e8398dd5c431ab5025ec7" exitCode=0 Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.436068 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerDied","Data":"a50d34b1dd358d7c95a4c22b41eaa637c5a3469ea425b6fa6f7b63908874bad7"} Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.436156 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerDied","Data":"b5230a56f72b92f59f1479439851803ed41a8e0044ef00c9559a2e9a8714d70a"} Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.436170 4836 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerDied","Data":"646a0ce4d819762c0c29a5b108e6c0da6c863c98cc2e8398dd5c431ab5025ec7"} Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.439996 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" event={"ID":"d8b08728-c946-43e4-85fa-0b033034bd26","Type":"ContainerDied","Data":"4a4c76bc357a85c3013a688505b9be4f985a6e124e635443b51b48a2960c2a36"} Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.440070 4836 scope.go:117] "RemoveContainer" containerID="6c739c83cd6c60eccf82cdc83958244ae182a579d8b46273f0f3fb2234b691ec" Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.440075 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cd565959-cbrcp" Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.494668 4836 scope.go:117] "RemoveContainer" containerID="10880f8e13f3f6efc6d19c175c05a63fc27f01501a301fd0a28b68afaa946ee2" Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.500099 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-cbrcp"] Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.528952 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cd565959-cbrcp"] Feb 17 14:31:54 crc kubenswrapper[4836]: I0217 14:31:54.631251 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8b08728-c946-43e4-85fa-0b033034bd26" path="/var/lib/kubelet/pods/d8b08728-c946-43e4-85fa-0b033034bd26/volumes" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.195189 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gczl5"] Feb 17 14:31:55 crc kubenswrapper[4836]: E0217 14:31:55.195789 4836 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8b08728-c946-43e4-85fa-0b033034bd26" containerName="init" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.195810 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b08728-c946-43e4-85fa-0b033034bd26" containerName="init" Feb 17 14:31:55 crc kubenswrapper[4836]: E0217 14:31:55.195824 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8b08728-c946-43e4-85fa-0b033034bd26" containerName="dnsmasq-dns" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.195834 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8b08728-c946-43e4-85fa-0b033034bd26" containerName="dnsmasq-dns" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.196057 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8b08728-c946-43e4-85fa-0b033034bd26" containerName="dnsmasq-dns" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.197870 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.210895 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8pkv\" (UniqueName: \"kubernetes.io/projected/4cd9d6fa-d7e3-483f-afdf-104754807815-kube-api-access-c8pkv\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.211011 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-utilities\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.211041 4836 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-catalog-content\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.213877 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gczl5"] Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.313339 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-utilities\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.313401 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-catalog-content\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.313568 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8pkv\" (UniqueName: \"kubernetes.io/projected/4cd9d6fa-d7e3-483f-afdf-104754807815-kube-api-access-c8pkv\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.314267 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-catalog-content\") pod \"certified-operators-gczl5\" (UID: 
\"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.314282 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-utilities\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.350389 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8pkv\" (UniqueName: \"kubernetes.io/projected/4cd9d6fa-d7e3-483f-afdf-104754807815-kube-api-access-c8pkv\") pod \"certified-operators-gczl5\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") " pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:55 crc kubenswrapper[4836]: I0217 14:31:55.517802 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:31:56 crc kubenswrapper[4836]: I0217 14:31:56.096209 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gczl5"] Feb 17 14:31:56 crc kubenswrapper[4836]: W0217 14:31:56.097806 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cd9d6fa_d7e3_483f_afdf_104754807815.slice/crio-bce52c57a9fcf2c1e5a87f3b45145f1651223562443f585e0fbfde97d5bea9a0 WatchSource:0}: Error finding container bce52c57a9fcf2c1e5a87f3b45145f1651223562443f585e0fbfde97d5bea9a0: Status 404 returned error can't find the container with id bce52c57a9fcf2c1e5a87f3b45145f1651223562443f585e0fbfde97d5bea9a0 Feb 17 14:31:56 crc kubenswrapper[4836]: I0217 14:31:56.466315 4836 generic.go:334] "Generic (PLEG): container finished" podID="4cd9d6fa-d7e3-483f-afdf-104754807815" 
containerID="d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2" exitCode=0 Feb 17 14:31:56 crc kubenswrapper[4836]: I0217 14:31:56.466381 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gczl5" event={"ID":"4cd9d6fa-d7e3-483f-afdf-104754807815","Type":"ContainerDied","Data":"d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2"} Feb 17 14:31:56 crc kubenswrapper[4836]: I0217 14:31:56.466417 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gczl5" event={"ID":"4cd9d6fa-d7e3-483f-afdf-104754807815","Type":"ContainerStarted","Data":"bce52c57a9fcf2c1e5a87f3b45145f1651223562443f585e0fbfde97d5bea9a0"} Feb 17 14:31:58 crc kubenswrapper[4836]: I0217 14:31:58.493437 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gczl5" event={"ID":"4cd9d6fa-d7e3-483f-afdf-104754807815","Type":"ContainerStarted","Data":"f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947"} Feb 17 14:31:58 crc kubenswrapper[4836]: I0217 14:31:58.500003 4836 generic.go:334] "Generic (PLEG): container finished" podID="143c175f-4768-4188-8f12-3f76bf70804f" containerID="c5153296780e29125ef218661157bfd30ddc0ca224b65cb9ffd34fbe9f884fe5" exitCode=0 Feb 17 14:31:58 crc kubenswrapper[4836]: I0217 14:31:58.500041 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerDied","Data":"c5153296780e29125ef218661157bfd30ddc0ca224b65cb9ffd34fbe9f884fe5"} Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.166687 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.286203 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.288981 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.292785 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.344761 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-scripts\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.344947 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-sg-core-conf-yaml\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.345100 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-combined-ca-bundle\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.345229 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-config-data\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: 
I0217 14:31:59.345268 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-log-httpd\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.345365 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-run-httpd\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.345403 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wvnc\" (UniqueName: \"kubernetes.io/projected/143c175f-4768-4188-8f12-3f76bf70804f-kube-api-access-5wvnc\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.345445 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-ceilometer-tls-certs\") pod \"143c175f-4768-4188-8f12-3f76bf70804f\" (UID: \"143c175f-4768-4188-8f12-3f76bf70804f\") " Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.346650 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.346743 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.347755 4836 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.348185 4836 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/143c175f-4768-4188-8f12-3f76bf70804f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.358721 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143c175f-4768-4188-8f12-3f76bf70804f-kube-api-access-5wvnc" (OuterVolumeSpecName: "kube-api-access-5wvnc") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "kube-api-access-5wvnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.381925 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-scripts" (OuterVolumeSpecName: "scripts") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.387397 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.424268 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.450541 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.450668 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wvnc\" (UniqueName: \"kubernetes.io/projected/143c175f-4768-4188-8f12-3f76bf70804f-kube-api-access-5wvnc\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.450694 4836 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.450704 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.450713 4836 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.484966 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-config-data" (OuterVolumeSpecName: "config-data") pod "143c175f-4768-4188-8f12-3f76bf70804f" (UID: "143c175f-4768-4188-8f12-3f76bf70804f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.514552 4836 generic.go:334] "Generic (PLEG): container finished" podID="079f20c9-f742-4c4b-a8c0-a2a09573bf62" containerID="6d24e9f78b938b24616765924395f09dc01b17f432bd2a5ca96dd30f763b95e2" exitCode=0 Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.514635 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h4mlr" event={"ID":"079f20c9-f742-4c4b-a8c0-a2a09573bf62","Type":"ContainerDied","Data":"6d24e9f78b938b24616765924395f09dc01b17f432bd2a5ca96dd30f763b95e2"} Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.519567 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"143c175f-4768-4188-8f12-3f76bf70804f","Type":"ContainerDied","Data":"9dbb838e04f66687c191dcad00147af5a9d0fa9da289d8afa2cdcb370fe12258"} Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.519632 4836 scope.go:117] "RemoveContainer" containerID="a50d34b1dd358d7c95a4c22b41eaa637c5a3469ea425b6fa6f7b63908874bad7" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.519860 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.523679 4836 generic.go:334] "Generic (PLEG): container finished" podID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerID="f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947" exitCode=0 Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.526037 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gczl5" event={"ID":"4cd9d6fa-d7e3-483f-afdf-104754807815","Type":"ContainerDied","Data":"f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947"} Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.549728 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.553079 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.553107 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/143c175f-4768-4188-8f12-3f76bf70804f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.633343 4836 scope.go:117] "RemoveContainer" containerID="b5230a56f72b92f59f1479439851803ed41a8e0044ef00c9559a2e9a8714d70a" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.639175 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.658689 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.700150 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4836]: E0217 
14:31:59.709345 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="proxy-httpd" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.709405 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="proxy-httpd" Feb 17 14:31:59 crc kubenswrapper[4836]: E0217 14:31:59.709491 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="sg-core" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.709506 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="sg-core" Feb 17 14:31:59 crc kubenswrapper[4836]: E0217 14:31:59.709522 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-central-agent" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.709536 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-central-agent" Feb 17 14:31:59 crc kubenswrapper[4836]: E0217 14:31:59.709562 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-notification-agent" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.709575 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-notification-agent" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.711440 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-central-agent" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.711468 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="sg-core" Feb 17 14:31:59 crc 
kubenswrapper[4836]: I0217 14:31:59.711479 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="proxy-httpd" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.711500 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="143c175f-4768-4188-8f12-3f76bf70804f" containerName="ceilometer-notification-agent" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.713888 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.726962 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.727153 4836 scope.go:117] "RemoveContainer" containerID="646a0ce4d819762c0c29a5b108e6c0da6c863c98cc2e8398dd5c431ab5025ec7" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.727773 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.727906 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.728164 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.759352 4836 scope.go:117] "RemoveContainer" containerID="c5153296780e29125ef218661157bfd30ddc0ca224b65cb9ffd34fbe9f884fe5" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.861063 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: 
I0217 14:31:59.862423 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.862594 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf30e-7916-4b59-8986-a5d2c218170e-log-httpd\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.862725 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-scripts\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.862825 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf30e-7916-4b59-8986-a5d2c218170e-run-httpd\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.862964 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.863106 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qrhqt\" (UniqueName: \"kubernetes.io/projected/1ddcf30e-7916-4b59-8986-a5d2c218170e-kube-api-access-qrhqt\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.863215 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-config-data\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.966408 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-scripts\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.966699 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf30e-7916-4b59-8986-a5d2c218170e-run-httpd\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.966826 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.967022 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrhqt\" (UniqueName: \"kubernetes.io/projected/1ddcf30e-7916-4b59-8986-a5d2c218170e-kube-api-access-qrhqt\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " 
pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.967104 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-config-data\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.967384 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.968067 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf30e-7916-4b59-8986-a5d2c218170e-run-httpd\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.968085 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.968360 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ddcf30e-7916-4b59-8986-a5d2c218170e-log-httpd\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.968967 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1ddcf30e-7916-4b59-8986-a5d2c218170e-log-httpd\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.972364 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.972410 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-config-data\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.981431 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.985651 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-scripts\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.987037 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ddcf30e-7916-4b59-8986-a5d2c218170e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.992123 4836 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrhqt\" (UniqueName: \"kubernetes.io/projected/1ddcf30e-7916-4b59-8986-a5d2c218170e-kube-api-access-qrhqt\") pod \"ceilometer-0\" (UID: \"1ddcf30e-7916-4b59-8986-a5d2c218170e\") " pod="openstack/ceilometer-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.995536 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:31:59 crc kubenswrapper[4836]: I0217 14:31:59.996190 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 14:32:00 crc kubenswrapper[4836]: I0217 14:32:00.044692 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 14:32:00 crc kubenswrapper[4836]: I0217 14:32:00.539001 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gczl5" event={"ID":"4cd9d6fa-d7e3-483f-afdf-104754807815","Type":"ContainerStarted","Data":"53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef"} Feb 17 14:32:00 crc kubenswrapper[4836]: I0217 14:32:00.575854 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gczl5" podStartSLOduration=2.040008613 podStartE2EDuration="5.575823025s" podCreationTimestamp="2026-02-17 14:31:55 +0000 UTC" firstStartedPulling="2026-02-17 14:31:56.468734293 +0000 UTC m=+1542.811662562" lastFinishedPulling="2026-02-17 14:32:00.004548705 +0000 UTC m=+1546.347476974" observedRunningTime="2026-02-17 14:32:00.562964797 +0000 UTC m=+1546.905893066" watchObservedRunningTime="2026-02-17 14:32:00.575823025 +0000 UTC m=+1546.918751294" Feb 17 14:32:00 crc kubenswrapper[4836]: I0217 14:32:00.589952 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143c175f-4768-4188-8f12-3f76bf70804f" 
path="/var/lib/kubelet/pods/143c175f-4768-4188-8f12-3f76bf70804f/volumes" Feb 17 14:32:00 crc kubenswrapper[4836]: I0217 14:32:00.608098 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.013653 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.014442 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.088665 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.204652 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-scripts\") pod \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.204752 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-config-data\") pod \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.204984 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w49k2\" (UniqueName: \"kubernetes.io/projected/079f20c9-f742-4c4b-a8c0-a2a09573bf62-kube-api-access-w49k2\") pod \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.205101 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-combined-ca-bundle\") pod \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\" (UID: \"079f20c9-f742-4c4b-a8c0-a2a09573bf62\") " Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.234156 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079f20c9-f742-4c4b-a8c0-a2a09573bf62-kube-api-access-w49k2" (OuterVolumeSpecName: "kube-api-access-w49k2") pod "079f20c9-f742-4c4b-a8c0-a2a09573bf62" (UID: "079f20c9-f742-4c4b-a8c0-a2a09573bf62"). InnerVolumeSpecName "kube-api-access-w49k2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.234954 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-scripts" (OuterVolumeSpecName: "scripts") pod "079f20c9-f742-4c4b-a8c0-a2a09573bf62" (UID: "079f20c9-f742-4c4b-a8c0-a2a09573bf62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.284385 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "079f20c9-f742-4c4b-a8c0-a2a09573bf62" (UID: "079f20c9-f742-4c4b-a8c0-a2a09573bf62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.284554 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-config-data" (OuterVolumeSpecName: "config-data") pod "079f20c9-f742-4c4b-a8c0-a2a09573bf62" (UID: "079f20c9-f742-4c4b-a8c0-a2a09573bf62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.311131 4836 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.311199 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.311213 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w49k2\" (UniqueName: \"kubernetes.io/projected/079f20c9-f742-4c4b-a8c0-a2a09573bf62-kube-api-access-w49k2\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.311228 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079f20c9-f742-4c4b-a8c0-a2a09573bf62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.564904 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-h4mlr" event={"ID":"079f20c9-f742-4c4b-a8c0-a2a09573bf62","Type":"ContainerDied","Data":"04c119ecda97e9ecffd6aff4094f5b50acfbc974345595ef1ecf805ee73c0e65"} Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.565327 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04c119ecda97e9ecffd6aff4094f5b50acfbc974345595ef1ecf805ee73c0e65" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.564959 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-h4mlr" Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.568352 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ddcf30e-7916-4b59-8986-a5d2c218170e","Type":"ContainerStarted","Data":"87ca4914522ee72ea747870e81b062f1e0a07903072c8b9301031784cbe78eee"} Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.762679 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.763410 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-log" containerID="cri-o://423c24607bf9f3ae8ddccd308e2f900af69eca84ed6108aea143a2d2240bf369" gracePeriod=30 Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.763586 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-api" containerID="cri-o://506f661fc5a6b974ebc362273c2cdd8a27145169f4fc426e732aad89c2e734ed" gracePeriod=30 Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.778570 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.778917 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1853ac32-f733-4d5f-9cc2-edf83a927b28" containerName="nova-scheduler-scheduler" containerID="cri-o://cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" gracePeriod=30 Feb 17 14:32:01 crc kubenswrapper[4836]: I0217 14:32:01.823714 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:01 crc kubenswrapper[4836]: E0217 14:32:01.825935 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:32:01 crc kubenswrapper[4836]: E0217 14:32:01.829227 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:32:01 crc kubenswrapper[4836]: E0217 14:32:01.831136 4836 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 14:32:01 crc kubenswrapper[4836]: E0217 14:32:01.831196 4836 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1853ac32-f733-4d5f-9cc2-edf83a927b28" containerName="nova-scheduler-scheduler" Feb 17 14:32:02 crc kubenswrapper[4836]: I0217 14:32:02.586718 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ddcf30e-7916-4b59-8986-a5d2c218170e","Type":"ContainerStarted","Data":"94aac7de459f3e7ae7761b0e4dbad73203c8083dca82a26ff7b92d1e673bbaff"} Feb 17 14:32:02 crc kubenswrapper[4836]: I0217 14:32:02.587466 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1ddcf30e-7916-4b59-8986-a5d2c218170e","Type":"ContainerStarted","Data":"ff5c43e04f82a1d6177f770d5c348df5177eeaae920a5717f5932754f70a1fa3"} Feb 17 14:32:02 crc kubenswrapper[4836]: I0217 14:32:02.588947 4836 generic.go:334] "Generic (PLEG): container finished" podID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerID="423c24607bf9f3ae8ddccd308e2f900af69eca84ed6108aea143a2d2240bf369" exitCode=143 Feb 17 14:32:02 crc kubenswrapper[4836]: I0217 14:32:02.589155 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c95da3ac-7563-49bf-a956-b19297cb7d97","Type":"ContainerDied","Data":"423c24607bf9f3ae8ddccd308e2f900af69eca84ed6108aea143a2d2240bf369"} Feb 17 14:32:02 crc kubenswrapper[4836]: I0217 14:32:02.589277 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-log" containerID="cri-o://e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e" gracePeriod=30 Feb 17 14:32:02 crc kubenswrapper[4836]: I0217 14:32:02.589837 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-metadata" containerID="cri-o://a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3" gracePeriod=30 Feb 17 14:32:03 crc kubenswrapper[4836]: I0217 14:32:03.614038 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ddcf30e-7916-4b59-8986-a5d2c218170e","Type":"ContainerStarted","Data":"394bce71060f461459b97cd132dac0a4ab8421cf47e610b6fff4a930aefe8c38"} Feb 17 14:32:03 crc kubenswrapper[4836]: I0217 14:32:03.617804 4836 generic.go:334] "Generic (PLEG): container finished" podID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerID="e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e" exitCode=143 Feb 17 14:32:03 crc 
kubenswrapper[4836]: I0217 14:32:03.617845 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"327aaf35-8278-4f1a-b369-7a40209c0a8e","Type":"ContainerDied","Data":"e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e"} Feb 17 14:32:05 crc kubenswrapper[4836]: I0217 14:32:05.518476 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:32:05 crc kubenswrapper[4836]: I0217 14:32:05.519576 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gczl5" Feb 17 14:32:05 crc kubenswrapper[4836]: I0217 14:32:05.646989 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ddcf30e-7916-4b59-8986-a5d2c218170e","Type":"ContainerStarted","Data":"f2c5f59a3ea0bb2c698d106db28ebcc901db443392a5bbd1ae4a3f5e393b59c6"} Feb 17 14:32:05 crc kubenswrapper[4836]: I0217 14:32:05.647856 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 14:32:05 crc kubenswrapper[4836]: I0217 14:32:05.669829 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.913155196 podStartE2EDuration="6.669801126s" podCreationTimestamp="2026-02-17 14:31:59 +0000 UTC" firstStartedPulling="2026-02-17 14:32:00.610118595 +0000 UTC m=+1546.953046864" lastFinishedPulling="2026-02-17 14:32:04.366764525 +0000 UTC m=+1550.709692794" observedRunningTime="2026-02-17 14:32:05.66811048 +0000 UTC m=+1552.011038749" watchObservedRunningTime="2026-02-17 14:32:05.669801126 +0000 UTC m=+1552.012729395" Feb 17 14:32:05 crc kubenswrapper[4836]: I0217 14:32:05.728634 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.225:8775/\": read tcp 10.217.0.2:59416->10.217.0.225:8775: read: connection reset by peer" Feb 17 14:32:05 crc kubenswrapper[4836]: I0217 14:32:05.728987 4836 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.225:8775/\": read tcp 10.217.0.2:59430->10.217.0.225:8775: read: connection reset by peer" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.397582 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.537223 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-config-data\") pod \"327aaf35-8278-4f1a-b369-7a40209c0a8e\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.537287 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-combined-ca-bundle\") pod \"327aaf35-8278-4f1a-b369-7a40209c0a8e\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.537550 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327aaf35-8278-4f1a-b369-7a40209c0a8e-logs\") pod \"327aaf35-8278-4f1a-b369-7a40209c0a8e\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.537604 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnlb4\" (UniqueName: \"kubernetes.io/projected/327aaf35-8278-4f1a-b369-7a40209c0a8e-kube-api-access-mnlb4\") pod 
\"327aaf35-8278-4f1a-b369-7a40209c0a8e\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.537743 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-nova-metadata-tls-certs\") pod \"327aaf35-8278-4f1a-b369-7a40209c0a8e\" (UID: \"327aaf35-8278-4f1a-b369-7a40209c0a8e\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.539892 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327aaf35-8278-4f1a-b369-7a40209c0a8e-logs" (OuterVolumeSpecName: "logs") pod "327aaf35-8278-4f1a-b369-7a40209c0a8e" (UID: "327aaf35-8278-4f1a-b369-7a40209c0a8e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.545382 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327aaf35-8278-4f1a-b369-7a40209c0a8e-kube-api-access-mnlb4" (OuterVolumeSpecName: "kube-api-access-mnlb4") pod "327aaf35-8278-4f1a-b369-7a40209c0a8e" (UID: "327aaf35-8278-4f1a-b369-7a40209c0a8e"). InnerVolumeSpecName "kube-api-access-mnlb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.580103 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-gczl5" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="registry-server" probeResult="failure" output=< Feb 17 14:32:06 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:32:06 crc kubenswrapper[4836]: > Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.582635 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-config-data" (OuterVolumeSpecName: "config-data") pod "327aaf35-8278-4f1a-b369-7a40209c0a8e" (UID: "327aaf35-8278-4f1a-b369-7a40209c0a8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.637804 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "327aaf35-8278-4f1a-b369-7a40209c0a8e" (UID: "327aaf35-8278-4f1a-b369-7a40209c0a8e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.641588 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.643339 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/327aaf35-8278-4f1a-b369-7a40209c0a8e-logs\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.643810 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnlb4\" (UniqueName: \"kubernetes.io/projected/327aaf35-8278-4f1a-b369-7a40209c0a8e-kube-api-access-mnlb4\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.643906 4836 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.643990 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.643506 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "327aaf35-8278-4f1a-b369-7a40209c0a8e" (UID: "327aaf35-8278-4f1a-b369-7a40209c0a8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.672829 4836 generic.go:334] "Generic (PLEG): container finished" podID="1853ac32-f733-4d5f-9cc2-edf83a927b28" containerID="cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" exitCode=0 Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.673076 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.673845 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1853ac32-f733-4d5f-9cc2-edf83a927b28","Type":"ContainerDied","Data":"cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1"} Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.673910 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1853ac32-f733-4d5f-9cc2-edf83a927b28","Type":"ContainerDied","Data":"942beb1e7c7f15eec98870c7e5614fc6da7b9d580327cb6b8b021c40fa96a882"} Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.673938 4836 scope.go:117] "RemoveContainer" containerID="cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.694705 4836 generic.go:334] "Generic (PLEG): container finished" podID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerID="a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3" exitCode=0 Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.694841 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"327aaf35-8278-4f1a-b369-7a40209c0a8e","Type":"ContainerDied","Data":"a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3"} Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.694940 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"327aaf35-8278-4f1a-b369-7a40209c0a8e","Type":"ContainerDied","Data":"f0a3643ff133a4988442a790112d5d5fb30bb8fc7d8f8119a27a6cf6da2e8bfc"} Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.697317 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.721714 4836 scope.go:117] "RemoveContainer" containerID="cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" Feb 17 14:32:06 crc kubenswrapper[4836]: E0217 14:32:06.722745 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1\": container with ID starting with cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1 not found: ID does not exist" containerID="cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.722947 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1"} err="failed to get container status \"cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1\": rpc error: code = NotFound desc = could not find container \"cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1\": container with ID starting with cdcb8668f9c38464113928fab0592c4bc426536aaf26770f316b86d52057f1a1 not found: ID does not exist" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.723060 4836 scope.go:117] "RemoveContainer" containerID="a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.745159 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-combined-ca-bundle\") pod \"1853ac32-f733-4d5f-9cc2-edf83a927b28\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.745313 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lbrvs\" (UniqueName: \"kubernetes.io/projected/1853ac32-f733-4d5f-9cc2-edf83a927b28-kube-api-access-lbrvs\") pod \"1853ac32-f733-4d5f-9cc2-edf83a927b28\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.745441 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-config-data\") pod \"1853ac32-f733-4d5f-9cc2-edf83a927b28\" (UID: \"1853ac32-f733-4d5f-9cc2-edf83a927b28\") " Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.746426 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/327aaf35-8278-4f1a-b369-7a40209c0a8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.749610 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1853ac32-f733-4d5f-9cc2-edf83a927b28-kube-api-access-lbrvs" (OuterVolumeSpecName: "kube-api-access-lbrvs") pod "1853ac32-f733-4d5f-9cc2-edf83a927b28" (UID: "1853ac32-f733-4d5f-9cc2-edf83a927b28"). InnerVolumeSpecName "kube-api-access-lbrvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.780402 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.782286 4836 scope.go:117] "RemoveContainer" containerID="e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.794568 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.808665 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.809414 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1853ac32-f733-4d5f-9cc2-edf83a927b28" (UID: "1853ac32-f733-4d5f-9cc2-edf83a927b28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: E0217 14:32:06.810685 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-metadata" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.810711 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-metadata" Feb 17 14:32:06 crc kubenswrapper[4836]: E0217 14:32:06.810734 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853ac32-f733-4d5f-9cc2-edf83a927b28" containerName="nova-scheduler-scheduler" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.810743 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853ac32-f733-4d5f-9cc2-edf83a927b28" containerName="nova-scheduler-scheduler" Feb 17 14:32:06 crc kubenswrapper[4836]: E0217 14:32:06.810802 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-log" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.810812 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-log" Feb 17 14:32:06 crc kubenswrapper[4836]: E0217 14:32:06.810825 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079f20c9-f742-4c4b-a8c0-a2a09573bf62" containerName="nova-manage" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.810835 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="079f20c9-f742-4c4b-a8c0-a2a09573bf62" containerName="nova-manage" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.811128 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="079f20c9-f742-4c4b-a8c0-a2a09573bf62" containerName="nova-manage" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.811828 4836 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1853ac32-f733-4d5f-9cc2-edf83a927b28" containerName="nova-scheduler-scheduler" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.811914 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-log" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.811932 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" containerName="nova-metadata-metadata" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.819458 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.826224 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.828073 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.828197 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.839634 4836 scope.go:117] "RemoveContainer" containerID="a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3" Feb 17 14:32:06 crc kubenswrapper[4836]: E0217 14:32:06.842691 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3\": container with ID starting with a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3 not found: ID does not exist" containerID="a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.842939 4836 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3"} err="failed to get container status \"a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3\": rpc error: code = NotFound desc = could not find container \"a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3\": container with ID starting with a23d238ccfddfc1e73a017a79f41c992e73a86ed04c0afd30e536b5ddc8892f3 not found: ID does not exist" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.843058 4836 scope.go:117] "RemoveContainer" containerID="e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e" Feb 17 14:32:06 crc kubenswrapper[4836]: E0217 14:32:06.844143 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e\": container with ID starting with e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e not found: ID does not exist" containerID="e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.844373 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e"} err="failed to get container status \"e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e\": rpc error: code = NotFound desc = could not find container \"e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e\": container with ID starting with e6a160448c863ff6f5997675db3bca2b01a51a67e8fdbdf7026c37112391dd6e not found: ID does not exist" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.850636 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpstt\" (UniqueName: \"kubernetes.io/projected/c56150e0-07ff-4a45-9231-26fa261942c4-kube-api-access-wpstt\") pod 
\"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.851017 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56150e0-07ff-4a45-9231-26fa261942c4-logs\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.851287 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-config-data\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.852163 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.852356 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.852513 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.852781 4836 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-lbrvs\" (UniqueName: \"kubernetes.io/projected/1853ac32-f733-4d5f-9cc2-edf83a927b28-kube-api-access-lbrvs\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.854545 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-config-data" (OuterVolumeSpecName: "config-data") pod "1853ac32-f733-4d5f-9cc2-edf83a927b28" (UID: "1853ac32-f733-4d5f-9cc2-edf83a927b28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.954288 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpstt\" (UniqueName: \"kubernetes.io/projected/c56150e0-07ff-4a45-9231-26fa261942c4-kube-api-access-wpstt\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.954392 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56150e0-07ff-4a45-9231-26fa261942c4-logs\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.954470 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-config-data\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.954538 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.954566 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.954655 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1853ac32-f733-4d5f-9cc2-edf83a927b28-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.955183 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c56150e0-07ff-4a45-9231-26fa261942c4-logs\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.958329 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.965053 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-config-data\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.965588 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c56150e0-07ff-4a45-9231-26fa261942c4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:06 crc kubenswrapper[4836]: I0217 14:32:06.980948 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpstt\" (UniqueName: \"kubernetes.io/projected/c56150e0-07ff-4a45-9231-26fa261942c4-kube-api-access-wpstt\") pod \"nova-metadata-0\" (UID: \"c56150e0-07ff-4a45-9231-26fa261942c4\") " pod="openstack/nova-metadata-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.137924 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.223437 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.269491 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.289415 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.291682 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.302009 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.311409 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.484233 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-config-data\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.484398 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.484458 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrm6m\" (UniqueName: \"kubernetes.io/projected/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-kube-api-access-nrm6m\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.586621 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.586946 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrm6m\" (UniqueName: \"kubernetes.io/projected/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-kube-api-access-nrm6m\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.587066 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-config-data\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.593586 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-config-data\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.595382 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.605388 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrm6m\" (UniqueName: \"kubernetes.io/projected/6bfcfdb5-3886-47e2-8e71-33c95dc14e73-kube-api-access-nrm6m\") pod \"nova-scheduler-0\" (UID: \"6bfcfdb5-3886-47e2-8e71-33c95dc14e73\") " pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.678008 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.728039 4836 generic.go:334] "Generic (PLEG): container finished" podID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerID="506f661fc5a6b974ebc362273c2cdd8a27145169f4fc426e732aad89c2e734ed" exitCode=0 Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.728153 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c95da3ac-7563-49bf-a956-b19297cb7d97","Type":"ContainerDied","Data":"506f661fc5a6b974ebc362273c2cdd8a27145169f4fc426e732aad89c2e734ed"} Feb 17 14:32:07 crc kubenswrapper[4836]: I0217 14:32:07.865322 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.139277 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.216627 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-internal-tls-certs\") pod \"c95da3ac-7563-49bf-a956-b19297cb7d97\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.216805 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-config-data\") pod \"c95da3ac-7563-49bf-a956-b19297cb7d97\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.216848 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95da3ac-7563-49bf-a956-b19297cb7d97-logs\") pod \"c95da3ac-7563-49bf-a956-b19297cb7d97\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " Feb 17 14:32:08 crc 
kubenswrapper[4836]: I0217 14:32:08.216883 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-public-tls-certs\") pod \"c95da3ac-7563-49bf-a956-b19297cb7d97\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.217404 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6vzb\" (UniqueName: \"kubernetes.io/projected/c95da3ac-7563-49bf-a956-b19297cb7d97-kube-api-access-m6vzb\") pod \"c95da3ac-7563-49bf-a956-b19297cb7d97\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") " Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.219239 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c95da3ac-7563-49bf-a956-b19297cb7d97-logs" (OuterVolumeSpecName: "logs") pod "c95da3ac-7563-49bf-a956-b19297cb7d97" (UID: "c95da3ac-7563-49bf-a956-b19297cb7d97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.226607 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c95da3ac-7563-49bf-a956-b19297cb7d97-kube-api-access-m6vzb" (OuterVolumeSpecName: "kube-api-access-m6vzb") pod "c95da3ac-7563-49bf-a956-b19297cb7d97" (UID: "c95da3ac-7563-49bf-a956-b19297cb7d97"). InnerVolumeSpecName "kube-api-access-m6vzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:32:08 crc kubenswrapper[4836]: W0217 14:32:08.284129 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bfcfdb5_3886_47e2_8e71_33c95dc14e73.slice/crio-9164b01fdf81254ae31a0690c9caa3d8d01063ff6d5527c633a1da310069d90d WatchSource:0}: Error finding container 9164b01fdf81254ae31a0690c9caa3d8d01063ff6d5527c633a1da310069d90d: Status 404 returned error can't find the container with id 9164b01fdf81254ae31a0690c9caa3d8d01063ff6d5527c633a1da310069d90d Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.291052 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.293756 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-config-data" (OuterVolumeSpecName: "config-data") pod "c95da3ac-7563-49bf-a956-b19297cb7d97" (UID: "c95da3ac-7563-49bf-a956-b19297cb7d97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.303409 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c95da3ac-7563-49bf-a956-b19297cb7d97" (UID: "c95da3ac-7563-49bf-a956-b19297cb7d97"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.308975 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c95da3ac-7563-49bf-a956-b19297cb7d97" (UID: "c95da3ac-7563-49bf-a956-b19297cb7d97"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.319352 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-combined-ca-bundle\") pod \"c95da3ac-7563-49bf-a956-b19297cb7d97\" (UID: \"c95da3ac-7563-49bf-a956-b19297cb7d97\") "
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.319744 4836 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.319764 4836 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.319775 4836 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c95da3ac-7563-49bf-a956-b19297cb7d97-logs\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.319784 4836 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.319795 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6vzb\" (UniqueName: \"kubernetes.io/projected/c95da3ac-7563-49bf-a956-b19297cb7d97-kube-api-access-m6vzb\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.350458 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c95da3ac-7563-49bf-a956-b19297cb7d97" (UID: "c95da3ac-7563-49bf-a956-b19297cb7d97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.422379 4836 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c95da3ac-7563-49bf-a956-b19297cb7d97-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.582418 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1853ac32-f733-4d5f-9cc2-edf83a927b28" path="/var/lib/kubelet/pods/1853ac32-f733-4d5f-9cc2-edf83a927b28/volumes"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.583781 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327aaf35-8278-4f1a-b369-7a40209c0a8e" path="/var/lib/kubelet/pods/327aaf35-8278-4f1a-b369-7a40209c0a8e/volumes"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.760387 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c56150e0-07ff-4a45-9231-26fa261942c4","Type":"ContainerStarted","Data":"f36aa1be569d660a4ed5b11eec489dc4a85f8381236ebdb9f5339117b15bc9db"}
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.760454 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c56150e0-07ff-4a45-9231-26fa261942c4","Type":"ContainerStarted","Data":"01e84a78b931b1ecbae9c5f18cc01f44e6d6bdf9262c30e38e84ac0e79f4084a"}
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.760469 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c56150e0-07ff-4a45-9231-26fa261942c4","Type":"ContainerStarted","Data":"b7dfaca0a861b9c677b8d5b23f2024ef4060315f27a7aec2434206cbd68b3367"}
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.763892 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c95da3ac-7563-49bf-a956-b19297cb7d97","Type":"ContainerDied","Data":"f3759853b325832fd4c44c00a3e3389086733322b67a29ca459e1ee11b19dfb7"}
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.763948 4836 scope.go:117] "RemoveContainer" containerID="506f661fc5a6b974ebc362273c2cdd8a27145169f4fc426e732aad89c2e734ed"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.764076 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.788616 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6bfcfdb5-3886-47e2-8e71-33c95dc14e73","Type":"ContainerStarted","Data":"0b6199a97ea4c0df86dcd9fdc5893ef07b4dd716ba8ffef30b72bdf6e4d5c3fe"}
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.788675 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6bfcfdb5-3886-47e2-8e71-33c95dc14e73","Type":"ContainerStarted","Data":"9164b01fdf81254ae31a0690c9caa3d8d01063ff6d5527c633a1da310069d90d"}
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.803459 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.803427273 podStartE2EDuration="2.803427273s" podCreationTimestamp="2026-02-17 14:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:08.79444485 +0000 UTC m=+1555.137373139" watchObservedRunningTime="2026-02-17 14:32:08.803427273 +0000 UTC m=+1555.146355542"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.823439 4836 scope.go:117] "RemoveContainer" containerID="423c24607bf9f3ae8ddccd308e2f900af69eca84ed6108aea143a2d2240bf369"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.827384 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.861529 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.863243 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.863218045 podStartE2EDuration="1.863218045s" podCreationTimestamp="2026-02-17 14:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:08.840228492 +0000 UTC m=+1555.183156761" watchObservedRunningTime="2026-02-17 14:32:08.863218045 +0000 UTC m=+1555.206146334"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.904789 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:32:08 crc kubenswrapper[4836]: E0217 14:32:08.908831 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-api"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.908925 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-api"
Feb 17 14:32:08 crc kubenswrapper[4836]: E0217 14:32:08.908970 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-log"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.908978 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-log"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.910256 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-log"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.910284 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" containerName="nova-api-api"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.915480 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.917547 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.920020 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.922101 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.923056 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.943149 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntvj\" (UniqueName: \"kubernetes.io/projected/a8815111-fe36-4868-b092-2f88255f8f2b-kube-api-access-nntvj\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.943627 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.943792 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-config-data\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.943998 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.944028 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:08 crc kubenswrapper[4836]: I0217 14:32:08.944508 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8815111-fe36-4868-b092-2f88255f8f2b-logs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.045890 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntvj\" (UniqueName: \"kubernetes.io/projected/a8815111-fe36-4868-b092-2f88255f8f2b-kube-api-access-nntvj\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.045980 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.046040 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-config-data\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.046108 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.046128 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.046227 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8815111-fe36-4868-b092-2f88255f8f2b-logs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.046868 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8815111-fe36-4868-b092-2f88255f8f2b-logs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.054738 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.055504 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.055826 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-config-data\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.056235 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8815111-fe36-4868-b092-2f88255f8f2b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.069965 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntvj\" (UniqueName: \"kubernetes.io/projected/a8815111-fe36-4868-b092-2f88255f8f2b-kube-api-access-nntvj\") pod \"nova-api-0\" (UID: \"a8815111-fe36-4868-b092-2f88255f8f2b\") " pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.250865 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.762275 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 14:32:09 crc kubenswrapper[4836]: I0217 14:32:09.808210 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8815111-fe36-4868-b092-2f88255f8f2b","Type":"ContainerStarted","Data":"5f03bdb5d30d971ad9d332b69b10a6ab0decf96b5c6d26985596b2a2ea77c8b7"}
Feb 17 14:32:10 crc kubenswrapper[4836]: I0217 14:32:10.585106 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c95da3ac-7563-49bf-a956-b19297cb7d97" path="/var/lib/kubelet/pods/c95da3ac-7563-49bf-a956-b19297cb7d97/volumes"
Feb 17 14:32:10 crc kubenswrapper[4836]: I0217 14:32:10.829765 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8815111-fe36-4868-b092-2f88255f8f2b","Type":"ContainerStarted","Data":"ab504b71215e8d4d5ca609a1e042996ab7cbd89da94251614b26a4005e6be5e5"}
Feb 17 14:32:10 crc kubenswrapper[4836]: I0217 14:32:10.829820 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8815111-fe36-4868-b092-2f88255f8f2b","Type":"ContainerStarted","Data":"74ab7ec7c7960538f983d4287ec205b08238f350ade0a8215a914b2068a69f5d"}
Feb 17 14:32:10 crc kubenswrapper[4836]: I0217 14:32:10.865671 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.865377303 podStartE2EDuration="2.865377303s" podCreationTimestamp="2026-02-17 14:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:32:10.850347785 +0000 UTC m=+1557.193276064" watchObservedRunningTime="2026-02-17 14:32:10.865377303 +0000 UTC m=+1557.208305572"
Feb 17 14:32:12 crc kubenswrapper[4836]: I0217 14:32:12.139620 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 14:32:12 crc kubenswrapper[4836]: I0217 14:32:12.140018 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 14:32:12 crc kubenswrapper[4836]: I0217 14:32:12.678662 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 17 14:32:15 crc kubenswrapper[4836]: I0217 14:32:15.574718 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gczl5"
Feb 17 14:32:15 crc kubenswrapper[4836]: I0217 14:32:15.632879 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gczl5"
Feb 17 14:32:15 crc kubenswrapper[4836]: I0217 14:32:15.828813 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gczl5"]
Feb 17 14:32:16 crc kubenswrapper[4836]: I0217 14:32:16.901642 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gczl5" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="registry-server" containerID="cri-o://53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef" gracePeriod=2
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.139865 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.140320 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.544163 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gczl5"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.678336 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.707274 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8pkv\" (UniqueName: \"kubernetes.io/projected/4cd9d6fa-d7e3-483f-afdf-104754807815-kube-api-access-c8pkv\") pod \"4cd9d6fa-d7e3-483f-afdf-104754807815\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") "
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.707467 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-utilities\") pod \"4cd9d6fa-d7e3-483f-afdf-104754807815\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") "
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.707630 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-catalog-content\") pod \"4cd9d6fa-d7e3-483f-afdf-104754807815\" (UID: \"4cd9d6fa-d7e3-483f-afdf-104754807815\") "
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.708228 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-utilities" (OuterVolumeSpecName: "utilities") pod "4cd9d6fa-d7e3-483f-afdf-104754807815" (UID: "4cd9d6fa-d7e3-483f-afdf-104754807815"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.715475 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd9d6fa-d7e3-483f-afdf-104754807815-kube-api-access-c8pkv" (OuterVolumeSpecName: "kube-api-access-c8pkv") pod "4cd9d6fa-d7e3-483f-afdf-104754807815" (UID: "4cd9d6fa-d7e3-483f-afdf-104754807815"). InnerVolumeSpecName "kube-api-access-c8pkv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.742723 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.763970 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4cd9d6fa-d7e3-483f-afdf-104754807815" (UID: "4cd9d6fa-d7e3-483f-afdf-104754807815"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.809781 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.809822 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4cd9d6fa-d7e3-483f-afdf-104754807815-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.809836 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8pkv\" (UniqueName: \"kubernetes.io/projected/4cd9d6fa-d7e3-483f-afdf-104754807815-kube-api-access-c8pkv\") on node \"crc\" DevicePath \"\""
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.915569 4836 generic.go:334] "Generic (PLEG): container finished" podID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerID="53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef" exitCode=0
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.915657 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gczl5"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.915695 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gczl5" event={"ID":"4cd9d6fa-d7e3-483f-afdf-104754807815","Type":"ContainerDied","Data":"53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef"}
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.915779 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gczl5" event={"ID":"4cd9d6fa-d7e3-483f-afdf-104754807815","Type":"ContainerDied","Data":"bce52c57a9fcf2c1e5a87f3b45145f1651223562443f585e0fbfde97d5bea9a0"}
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.915821 4836 scope.go:117] "RemoveContainer" containerID="53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.953550 4836 scope.go:117] "RemoveContainer" containerID="f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.962818 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gczl5"]
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.965494 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.976725 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gczl5"]
Feb 17 14:32:17 crc kubenswrapper[4836]: I0217 14:32:17.990200 4836 scope.go:117] "RemoveContainer" containerID="d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.039838 4836 scope.go:117] "RemoveContainer" containerID="53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef"
Feb 17 14:32:18 crc kubenswrapper[4836]: E0217 14:32:18.040574 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef\": container with ID starting with 53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef not found: ID does not exist" containerID="53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.040638 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef"} err="failed to get container status \"53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef\": rpc error: code = NotFound desc = could not find container \"53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef\": container with ID starting with 53b60b98531abff91e0a218cd8150c220b43da5bca3da77286dd345ff07020ef not found: ID does not exist"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.040678 4836 scope.go:117] "RemoveContainer" containerID="f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947"
Feb 17 14:32:18 crc kubenswrapper[4836]: E0217 14:32:18.041530 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947\": container with ID starting with f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947 not found: ID does not exist" containerID="f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.041570 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947"} err="failed to get container status \"f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947\": rpc error: code = NotFound desc = could not find container \"f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947\": container with ID starting with f59ee91509ea9e36d15899731572e250cd29f78a13163eb836fae779d30ef947 not found: ID does not exist"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.041593 4836 scope.go:117] "RemoveContainer" containerID="d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2"
Feb 17 14:32:18 crc kubenswrapper[4836]: E0217 14:32:18.042031 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2\": container with ID starting with d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2 not found: ID does not exist" containerID="d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.042070 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2"} err="failed to get container status \"d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2\": rpc error: code = NotFound desc = could not find container \"d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2\": container with ID starting with d729bd6482605fb4977c1bf01960378417a9b2e6780614aa5f4dbff52eeda0e2 not found: ID does not exist"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.188908 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c56150e0-07ff-4a45-9231-26fa261942c4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.234:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.189074 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c56150e0-07ff-4a45-9231-26fa261942c4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.234:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:32:18 crc kubenswrapper[4836]: I0217 14:32:18.583183 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" path="/var/lib/kubelet/pods/4cd9d6fa-d7e3-483f-afdf-104754807815/volumes"
Feb 17 14:32:19 crc kubenswrapper[4836]: I0217 14:32:19.251384 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 14:32:19 crc kubenswrapper[4836]: I0217 14:32:19.251767 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 14:32:20 crc kubenswrapper[4836]: I0217 14:32:20.264533 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8815111-fe36-4868-b092-2f88255f8f2b" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:32:20 crc kubenswrapper[4836]: I0217 14:32:20.264525 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8815111-fe36-4868-b092-2f88255f8f2b" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 14:32:22 crc kubenswrapper[4836]: I0217 14:32:22.336366 4836 scope.go:117] "RemoveContainer" containerID="fa952a578ab7d74e43550d2abf42e1871632978ec68916c0a6508b2ed82226f0"
Feb 17 14:32:22 crc kubenswrapper[4836]: I0217 14:32:22.375652 4836 scope.go:117] "RemoveContainer" containerID="99c757b68ed859a793668b56d22b853641589be9aa542f670159f298a8c5ffcd"
Feb 17 14:32:22 crc kubenswrapper[4836]: I0217 14:32:22.424414 4836 scope.go:117] "RemoveContainer" containerID="57ea1eebc786d3a8ae12a685cfa802406deab325110c652e436a68a0c258022f"
Feb 17 14:32:22 crc kubenswrapper[4836]: I0217 14:32:22.474054 4836 scope.go:117] "RemoveContainer" containerID="b84dd65de54881081222d1401d684becd3ab6f396a5d3ddb1a10e413f4f858e0"
Feb 17 14:32:27 crc kubenswrapper[4836]: I0217 14:32:27.146026 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 14:32:27 crc kubenswrapper[4836]: I0217 14:32:27.147873 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 14:32:27 crc kubenswrapper[4836]: I0217 14:32:27.152306 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 14:32:28 crc kubenswrapper[4836]: I0217 14:32:28.066538 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 14:32:29 crc kubenswrapper[4836]: I0217 14:32:29.261886 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 14:32:29 crc kubenswrapper[4836]: I0217 14:32:29.262017 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 14:32:29 crc kubenswrapper[4836]: I0217 14:32:29.263055 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 14:32:29 crc kubenswrapper[4836]: I0217 14:32:29.263256 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 14:32:29 crc kubenswrapper[4836]: I0217 14:32:29.271902 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 14:32:29 crc kubenswrapper[4836]: I0217 14:32:29.275588 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 14:32:30 crc kubenswrapper[4836]: I0217 14:32:30.236551 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.157246 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-79spl"]
Feb 17 14:32:47 crc kubenswrapper[4836]: E0217 14:32:47.160054 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="extract-content"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.160169 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="extract-content"
Feb 17 14:32:47 crc kubenswrapper[4836]: E0217 14:32:47.160245 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="registry-server"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.160316 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="registry-server"
Feb 17 14:32:47 crc kubenswrapper[4836]: E0217 14:32:47.160386 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="extract-utilities"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.160442 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="extract-utilities"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.160920 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cd9d6fa-d7e3-483f-afdf-104754807815" containerName="registry-server"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.163097 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.171309 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79spl"]
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.248767 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-utilities\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.248933 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjml9\" (UniqueName: \"kubernetes.io/projected/349d9039-4cce-4f99-83f3-f12ad111cae1-kube-api-access-hjml9\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.248976 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-catalog-content\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.352198 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-utilities\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.352367 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjml9\" (UniqueName: \"kubernetes.io/projected/349d9039-4cce-4f99-83f3-f12ad111cae1-kube-api-access-hjml9\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.352409 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-catalog-content\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.352727 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-utilities\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.352840 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-catalog-content\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.375791 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjml9\" (UniqueName: \"kubernetes.io/projected/349d9039-4cce-4f99-83f3-f12ad111cae1-kube-api-access-hjml9\") pod \"redhat-marketplace-79spl\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:47 crc kubenswrapper[4836]: I0217 14:32:47.510266 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79spl"
Feb 17 14:32:48 crc kubenswrapper[4836]: I0217 14:32:48.114977 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-79spl"]
Feb 17 14:32:48 crc kubenswrapper[4836]: I0217 14:32:48.291465 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79spl" event={"ID":"349d9039-4cce-4f99-83f3-f12ad111cae1","Type":"ContainerStarted","Data":"a013853d66ffa5e96192c446178dcfb4fc6eec81200e42d20a9171d8c3996fc2"}
Feb 17 14:32:49 crc kubenswrapper[4836]: I0217 14:32:49.306233 4836 generic.go:334] "Generic (PLEG): container finished" podID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerID="dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0" exitCode=0
Feb 17 14:32:49 crc kubenswrapper[4836]: I0217 14:32:49.306340 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79spl" event={"ID":"349d9039-4cce-4f99-83f3-f12ad111cae1","Type":"ContainerDied","Data":"dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0"}
Feb 17 14:32:49 crc kubenswrapper[4836]: I0217 14:32:49.309654 4836 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 14:32:51 crc kubenswrapper[4836]: I0217 14:32:51.336261 4836 generic.go:334] "Generic (PLEG): container finished" podID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerID="70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b" exitCode=0
Feb 17 14:32:51 crc kubenswrapper[4836]: I0217 14:32:51.336379 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79spl" event={"ID":"349d9039-4cce-4f99-83f3-f12ad111cae1","Type":"ContainerDied","Data":"70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b"}
Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.404844 4836 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openshift-marketplace/redhat-marketplace-79spl" event={"ID":"349d9039-4cce-4f99-83f3-f12ad111cae1","Type":"ContainerStarted","Data":"569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6"} Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.435040 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-79spl" podStartSLOduration=2.907335903 podStartE2EDuration="5.435014371s" podCreationTimestamp="2026-02-17 14:32:47 +0000 UTC" firstStartedPulling="2026-02-17 14:32:49.309275708 +0000 UTC m=+1595.652203977" lastFinishedPulling="2026-02-17 14:32:51.836954176 +0000 UTC m=+1598.179882445" observedRunningTime="2026-02-17 14:32:52.431629129 +0000 UTC m=+1598.774557408" watchObservedRunningTime="2026-02-17 14:32:52.435014371 +0000 UTC m=+1598.777942640" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.506379 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-65gzh"] Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.509025 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.521721 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65gzh"] Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.606413 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-catalog-content\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.606667 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-utilities\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.606953 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8246n\" (UniqueName: \"kubernetes.io/projected/52cf9c20-bb50-4295-a308-add7b717f6ce-kube-api-access-8246n\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.709071 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8246n\" (UniqueName: \"kubernetes.io/projected/52cf9c20-bb50-4295-a308-add7b717f6ce-kube-api-access-8246n\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.709250 4836 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-catalog-content\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.709338 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-utilities\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.709985 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-utilities\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.709992 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-catalog-content\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.737288 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8246n\" (UniqueName: \"kubernetes.io/projected/52cf9c20-bb50-4295-a308-add7b717f6ce-kube-api-access-8246n\") pod \"community-operators-65gzh\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:52 crc kubenswrapper[4836]: I0217 14:32:52.832733 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:32:53 crc kubenswrapper[4836]: I0217 14:32:53.597210 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65gzh"] Feb 17 14:32:54 crc kubenswrapper[4836]: I0217 14:32:54.436243 4836 generic.go:334] "Generic (PLEG): container finished" podID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerID="589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2" exitCode=0 Feb 17 14:32:54 crc kubenswrapper[4836]: I0217 14:32:54.436367 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65gzh" event={"ID":"52cf9c20-bb50-4295-a308-add7b717f6ce","Type":"ContainerDied","Data":"589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2"} Feb 17 14:32:54 crc kubenswrapper[4836]: I0217 14:32:54.436637 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65gzh" event={"ID":"52cf9c20-bb50-4295-a308-add7b717f6ce","Type":"ContainerStarted","Data":"845d9e040e01c21eede646e7791a3eeafb44af87e66b2eea5e7b077448fb458c"} Feb 17 14:32:55 crc kubenswrapper[4836]: I0217 14:32:55.449517 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65gzh" event={"ID":"52cf9c20-bb50-4295-a308-add7b717f6ce","Type":"ContainerStarted","Data":"e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04"} Feb 17 14:32:57 crc kubenswrapper[4836]: I0217 14:32:57.476147 4836 generic.go:334] "Generic (PLEG): container finished" podID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerID="e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04" exitCode=0 Feb 17 14:32:57 crc kubenswrapper[4836]: I0217 14:32:57.477098 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65gzh" 
event={"ID":"52cf9c20-bb50-4295-a308-add7b717f6ce","Type":"ContainerDied","Data":"e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04"} Feb 17 14:32:57 crc kubenswrapper[4836]: I0217 14:32:57.510465 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-79spl" Feb 17 14:32:57 crc kubenswrapper[4836]: I0217 14:32:57.510813 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-79spl" Feb 17 14:32:57 crc kubenswrapper[4836]: I0217 14:32:57.575430 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-79spl" Feb 17 14:32:58 crc kubenswrapper[4836]: I0217 14:32:58.494220 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65gzh" event={"ID":"52cf9c20-bb50-4295-a308-add7b717f6ce","Type":"ContainerStarted","Data":"ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf"} Feb 17 14:32:58 crc kubenswrapper[4836]: I0217 14:32:58.525360 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-65gzh" podStartSLOduration=3.108142823 podStartE2EDuration="6.525336743s" podCreationTimestamp="2026-02-17 14:32:52 +0000 UTC" firstStartedPulling="2026-02-17 14:32:54.43908251 +0000 UTC m=+1600.782010779" lastFinishedPulling="2026-02-17 14:32:57.85627643 +0000 UTC m=+1604.199204699" observedRunningTime="2026-02-17 14:32:58.522861417 +0000 UTC m=+1604.865789706" watchObservedRunningTime="2026-02-17 14:32:58.525336743 +0000 UTC m=+1604.868265012" Feb 17 14:32:58 crc kubenswrapper[4836]: I0217 14:32:58.558654 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-79spl" Feb 17 14:32:59 crc kubenswrapper[4836]: I0217 14:32:59.765518 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:32:59 crc kubenswrapper[4836]: I0217 14:32:59.765956 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:33:00 crc kubenswrapper[4836]: I0217 14:33:00.698240 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79spl"] Feb 17 14:33:01 crc kubenswrapper[4836]: I0217 14:33:01.547730 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-79spl" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="registry-server" containerID="cri-o://569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6" gracePeriod=2 Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.184895 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79spl" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.269224 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjml9\" (UniqueName: \"kubernetes.io/projected/349d9039-4cce-4f99-83f3-f12ad111cae1-kube-api-access-hjml9\") pod \"349d9039-4cce-4f99-83f3-f12ad111cae1\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.269417 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-catalog-content\") pod \"349d9039-4cce-4f99-83f3-f12ad111cae1\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.269615 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-utilities\") pod \"349d9039-4cce-4f99-83f3-f12ad111cae1\" (UID: \"349d9039-4cce-4f99-83f3-f12ad111cae1\") " Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.270124 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-utilities" (OuterVolumeSpecName: "utilities") pod "349d9039-4cce-4f99-83f3-f12ad111cae1" (UID: "349d9039-4cce-4f99-83f3-f12ad111cae1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.277732 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/349d9039-4cce-4f99-83f3-f12ad111cae1-kube-api-access-hjml9" (OuterVolumeSpecName: "kube-api-access-hjml9") pod "349d9039-4cce-4f99-83f3-f12ad111cae1" (UID: "349d9039-4cce-4f99-83f3-f12ad111cae1"). InnerVolumeSpecName "kube-api-access-hjml9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.291157 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "349d9039-4cce-4f99-83f3-f12ad111cae1" (UID: "349d9039-4cce-4f99-83f3-f12ad111cae1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.372946 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjml9\" (UniqueName: \"kubernetes.io/projected/349d9039-4cce-4f99-83f3-f12ad111cae1-kube-api-access-hjml9\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.372996 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.373009 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/349d9039-4cce-4f99-83f3-f12ad111cae1-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.560186 4836 generic.go:334] "Generic (PLEG): container finished" podID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerID="569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6" exitCode=0 Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.560292 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79spl" event={"ID":"349d9039-4cce-4f99-83f3-f12ad111cae1","Type":"ContainerDied","Data":"569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6"} Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.560328 4836 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-79spl" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.560371 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-79spl" event={"ID":"349d9039-4cce-4f99-83f3-f12ad111cae1","Type":"ContainerDied","Data":"a013853d66ffa5e96192c446178dcfb4fc6eec81200e42d20a9171d8c3996fc2"} Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.560402 4836 scope.go:117] "RemoveContainer" containerID="569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.587161 4836 scope.go:117] "RemoveContainer" containerID="70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.610702 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-79spl"] Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.625128 4836 scope.go:117] "RemoveContainer" containerID="dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.626005 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-79spl"] Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.675133 4836 scope.go:117] "RemoveContainer" containerID="569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6" Feb 17 14:33:02 crc kubenswrapper[4836]: E0217 14:33:02.675929 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6\": container with ID starting with 569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6 not found: ID does not exist" containerID="569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.676001 4836 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6"} err="failed to get container status \"569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6\": rpc error: code = NotFound desc = could not find container \"569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6\": container with ID starting with 569a357266f7e1f0fc1cc6f6be5224922001d03792c86e5f281447d8382e40c6 not found: ID does not exist" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.676038 4836 scope.go:117] "RemoveContainer" containerID="70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b" Feb 17 14:33:02 crc kubenswrapper[4836]: E0217 14:33:02.676600 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b\": container with ID starting with 70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b not found: ID does not exist" containerID="70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.676639 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b"} err="failed to get container status \"70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b\": rpc error: code = NotFound desc = could not find container \"70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b\": container with ID starting with 70b11e76751646ba17646df4e29485c533614f909c1dba6960938f572df3db8b not found: ID does not exist" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.676666 4836 scope.go:117] "RemoveContainer" containerID="dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0" Feb 17 14:33:02 crc kubenswrapper[4836]: E0217 
14:33:02.677183 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0\": container with ID starting with dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0 not found: ID does not exist" containerID="dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.677222 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0"} err="failed to get container status \"dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0\": rpc error: code = NotFound desc = could not find container \"dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0\": container with ID starting with dd37c66da0b978389c4d358d2f04173d267ab1cfda22f7781de1a6f5ae779bc0 not found: ID does not exist" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.833553 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.833632 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:33:02 crc kubenswrapper[4836]: I0217 14:33:02.897633 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:33:03 crc kubenswrapper[4836]: I0217 14:33:03.619337 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:33:04 crc kubenswrapper[4836]: I0217 14:33:04.580848 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" 
path="/var/lib/kubelet/pods/349d9039-4cce-4f99-83f3-f12ad111cae1/volumes" Feb 17 14:33:05 crc kubenswrapper[4836]: I0217 14:33:05.100787 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65gzh"] Feb 17 14:33:05 crc kubenswrapper[4836]: I0217 14:33:05.605043 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-65gzh" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="registry-server" containerID="cri-o://ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf" gracePeriod=2 Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.285345 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65gzh" Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.468779 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8246n\" (UniqueName: \"kubernetes.io/projected/52cf9c20-bb50-4295-a308-add7b717f6ce-kube-api-access-8246n\") pod \"52cf9c20-bb50-4295-a308-add7b717f6ce\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.469163 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-catalog-content\") pod \"52cf9c20-bb50-4295-a308-add7b717f6ce\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.469446 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-utilities\") pod \"52cf9c20-bb50-4295-a308-add7b717f6ce\" (UID: \"52cf9c20-bb50-4295-a308-add7b717f6ce\") " Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.470066 4836 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-utilities" (OuterVolumeSpecName: "utilities") pod "52cf9c20-bb50-4295-a308-add7b717f6ce" (UID: "52cf9c20-bb50-4295-a308-add7b717f6ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.470534 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.485262 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cf9c20-bb50-4295-a308-add7b717f6ce-kube-api-access-8246n" (OuterVolumeSpecName: "kube-api-access-8246n") pod "52cf9c20-bb50-4295-a308-add7b717f6ce" (UID: "52cf9c20-bb50-4295-a308-add7b717f6ce"). InnerVolumeSpecName "kube-api-access-8246n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.544514 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52cf9c20-bb50-4295-a308-add7b717f6ce" (UID: "52cf9c20-bb50-4295-a308-add7b717f6ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.572937 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8246n\" (UniqueName: \"kubernetes.io/projected/52cf9c20-bb50-4295-a308-add7b717f6ce-kube-api-access-8246n\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.572984 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52cf9c20-bb50-4295-a308-add7b717f6ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.619730 4836 generic.go:334] "Generic (PLEG): container finished" podID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerID="ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf" exitCode=0 Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.619799 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65gzh" event={"ID":"52cf9c20-bb50-4295-a308-add7b717f6ce","Type":"ContainerDied","Data":"ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf"} Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.619848 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65gzh" event={"ID":"52cf9c20-bb50-4295-a308-add7b717f6ce","Type":"ContainerDied","Data":"845d9e040e01c21eede646e7791a3eeafb44af87e66b2eea5e7b077448fb458c"} Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.619871 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-65gzh"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.619882 4836 scope.go:117] "RemoveContainer" containerID="ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.664927 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65gzh"]
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.667210 4836 scope.go:117] "RemoveContainer" containerID="e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.685012 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-65gzh"]
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.710829 4836 scope.go:117] "RemoveContainer" containerID="589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.752550 4836 scope.go:117] "RemoveContainer" containerID="ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf"
Feb 17 14:33:06 crc kubenswrapper[4836]: E0217 14:33:06.753235 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf\": container with ID starting with ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf not found: ID does not exist" containerID="ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.753322 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf"} err="failed to get container status \"ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf\": rpc error: code = NotFound desc = could not find container \"ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf\": container with ID starting with ba76b4465f5c067dce887a59bb959e096cd91077fbbe7fff1637f01594af51cf not found: ID does not exist"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.753366 4836 scope.go:117] "RemoveContainer" containerID="e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04"
Feb 17 14:33:06 crc kubenswrapper[4836]: E0217 14:33:06.753733 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04\": container with ID starting with e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04 not found: ID does not exist" containerID="e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.753776 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04"} err="failed to get container status \"e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04\": rpc error: code = NotFound desc = could not find container \"e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04\": container with ID starting with e9baafd53b9f38a4a14db774f6c7ff65f7621f5b63db204c143d432c7c69cf04 not found: ID does not exist"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.753796 4836 scope.go:117] "RemoveContainer" containerID="589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2"
Feb 17 14:33:06 crc kubenswrapper[4836]: E0217 14:33:06.754193 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2\": container with ID starting with 589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2 not found: ID does not exist" containerID="589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2"
Feb 17 14:33:06 crc kubenswrapper[4836]: I0217 14:33:06.754244 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2"} err="failed to get container status \"589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2\": rpc error: code = NotFound desc = could not find container \"589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2\": container with ID starting with 589c03a33d996d19da55c16d13129e22ee3a80583c3c47707a14170d5f4c36a2 not found: ID does not exist"
Feb 17 14:33:08 crc kubenswrapper[4836]: I0217 14:33:08.580506 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" path="/var/lib/kubelet/pods/52cf9c20-bb50-4295-a308-add7b717f6ce/volumes"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.453206 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snsbl/must-gather-4sqf7"]
Feb 17 14:33:09 crc kubenswrapper[4836]: E0217 14:33:09.453851 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="registry-server"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.453878 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="registry-server"
Feb 17 14:33:09 crc kubenswrapper[4836]: E0217 14:33:09.453890 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="extract-utilities"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.453898 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="extract-utilities"
Feb 17 14:33:09 crc kubenswrapper[4836]: E0217 14:33:09.453927 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="registry-server"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.453933 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="registry-server"
Feb 17 14:33:09 crc kubenswrapper[4836]: E0217 14:33:09.453945 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="extract-content"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.453952 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="extract-content"
Feb 17 14:33:09 crc kubenswrapper[4836]: E0217 14:33:09.453972 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="extract-content"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.453978 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="extract-content"
Feb 17 14:33:09 crc kubenswrapper[4836]: E0217 14:33:09.453997 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="extract-utilities"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.454002 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="extract-utilities"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.454211 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="52cf9c20-bb50-4295-a308-add7b717f6ce" containerName="registry-server"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.454236 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="349d9039-4cce-4f99-83f3-f12ad111cae1" containerName="registry-server"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.455659 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.458523 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-snsbl"/"openshift-service-ca.crt"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.458759 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-snsbl"/"kube-root-ca.crt"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.494582 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-snsbl/must-gather-4sqf7"]
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.577029 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/781729f0-fe27-45e7-bd7b-23709696ec4d-must-gather-output\") pod \"must-gather-4sqf7\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.577197 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6j6w\" (UniqueName: \"kubernetes.io/projected/781729f0-fe27-45e7-bd7b-23709696ec4d-kube-api-access-t6j6w\") pod \"must-gather-4sqf7\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.678034 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/781729f0-fe27-45e7-bd7b-23709696ec4d-must-gather-output\") pod \"must-gather-4sqf7\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.678156 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6j6w\" (UniqueName: \"kubernetes.io/projected/781729f0-fe27-45e7-bd7b-23709696ec4d-kube-api-access-t6j6w\") pod \"must-gather-4sqf7\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.678516 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/781729f0-fe27-45e7-bd7b-23709696ec4d-must-gather-output\") pod \"must-gather-4sqf7\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.708866 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6j6w\" (UniqueName: \"kubernetes.io/projected/781729f0-fe27-45e7-bd7b-23709696ec4d-kube-api-access-t6j6w\") pod \"must-gather-4sqf7\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:09 crc kubenswrapper[4836]: I0217 14:33:09.780620 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/must-gather-4sqf7"
Feb 17 14:33:10 crc kubenswrapper[4836]: I0217 14:33:10.301086 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-snsbl/must-gather-4sqf7"]
Feb 17 14:33:10 crc kubenswrapper[4836]: I0217 14:33:10.679641 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/must-gather-4sqf7" event={"ID":"781729f0-fe27-45e7-bd7b-23709696ec4d","Type":"ContainerStarted","Data":"7228d3ea40d695c4e76f39706f5515b626db9e0c64eeee26178fe2c013e04da1"}
Feb 17 14:33:20 crc kubenswrapper[4836]: I0217 14:33:20.829769 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/must-gather-4sqf7" event={"ID":"781729f0-fe27-45e7-bd7b-23709696ec4d","Type":"ContainerStarted","Data":"07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67"}
Feb 17 14:33:20 crc kubenswrapper[4836]: I0217 14:33:20.830405 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/must-gather-4sqf7" event={"ID":"781729f0-fe27-45e7-bd7b-23709696ec4d","Type":"ContainerStarted","Data":"f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d"}
Feb 17 14:33:20 crc kubenswrapper[4836]: I0217 14:33:20.853383 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-snsbl/must-gather-4sqf7" podStartSLOduration=2.307672306 podStartE2EDuration="11.853348555s" podCreationTimestamp="2026-02-17 14:33:09 +0000 UTC" firstStartedPulling="2026-02-17 14:33:10.317001362 +0000 UTC m=+1616.659929631" lastFinishedPulling="2026-02-17 14:33:19.862677611 +0000 UTC m=+1626.205605880" observedRunningTime="2026-02-17 14:33:20.846852181 +0000 UTC m=+1627.189780460" watchObservedRunningTime="2026-02-17 14:33:20.853348555 +0000 UTC m=+1627.196276844"
Feb 17 14:33:22 crc kubenswrapper[4836]: I0217 14:33:22.758322 4836 scope.go:117] "RemoveContainer" containerID="faf1f0c01e2ba58effda0101e73091532e490c7632b908240461cde1c4eacd7e"
Feb 17 14:33:22 crc kubenswrapper[4836]: I0217 14:33:22.804957 4836 scope.go:117] "RemoveContainer" containerID="d1ced8732b18e9a32bcf99bb2f034caca8afbe19ad1c6c3a49849748da69630c"
Feb 17 14:33:22 crc kubenswrapper[4836]: I0217 14:33:22.876827 4836 scope.go:117] "RemoveContainer" containerID="de75bc86bd0570fcef07a3f3195cfec352721b59eef66e22b061ebca87ca6456"
Feb 17 14:33:22 crc kubenswrapper[4836]: I0217 14:33:22.902226 4836 scope.go:117] "RemoveContainer" containerID="44931b40ada4bc7bee4acb5d1054d14507951ed9df360a9eb97ae5e6b0efb503"
Feb 17 14:33:22 crc kubenswrapper[4836]: I0217 14:33:22.925252 4836 scope.go:117] "RemoveContainer" containerID="4c54331d8c22a82e7135a4bdfa56b01c1bacccea5967146f9a8bb1c17d9ca3da"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.430665 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snsbl/crc-debug-stlbp"]
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.432898 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.435216 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-snsbl"/"default-dockercfg-ht85k"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.594528 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc86l\" (UniqueName: \"kubernetes.io/projected/ac579e56-4f23-4b65-ad07-a1df27f67146-kube-api-access-zc86l\") pod \"crc-debug-stlbp\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") " pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.597214 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac579e56-4f23-4b65-ad07-a1df27f67146-host\") pod \"crc-debug-stlbp\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") " pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.698993 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac579e56-4f23-4b65-ad07-a1df27f67146-host\") pod \"crc-debug-stlbp\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") " pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.699157 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac579e56-4f23-4b65-ad07-a1df27f67146-host\") pod \"crc-debug-stlbp\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") " pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.699167 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc86l\" (UniqueName: \"kubernetes.io/projected/ac579e56-4f23-4b65-ad07-a1df27f67146-kube-api-access-zc86l\") pod \"crc-debug-stlbp\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") " pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.720231 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc86l\" (UniqueName: \"kubernetes.io/projected/ac579e56-4f23-4b65-ad07-a1df27f67146-kube-api-access-zc86l\") pod \"crc-debug-stlbp\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") " pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.750796 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:25 crc kubenswrapper[4836]: I0217 14:33:25.887679 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/crc-debug-stlbp" event={"ID":"ac579e56-4f23-4b65-ad07-a1df27f67146","Type":"ContainerStarted","Data":"6b9dafc80a9454cecd841ba44f7dffc8dfd96b47bb5dbc19c92e88113d30344a"}
Feb 17 14:33:29 crc kubenswrapper[4836]: I0217 14:33:29.765462 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:33:29 crc kubenswrapper[4836]: I0217 14:33:29.766133 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:33:40 crc kubenswrapper[4836]: I0217 14:33:40.086725 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/crc-debug-stlbp" event={"ID":"ac579e56-4f23-4b65-ad07-a1df27f67146","Type":"ContainerStarted","Data":"0b1fdb782cc59c87b5c334a8e29bc01c7def7137ff5e1a24115754176ed4d2ab"}
Feb 17 14:33:40 crc kubenswrapper[4836]: I0217 14:33:40.116322 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-snsbl/crc-debug-stlbp" podStartSLOduration=1.47615767 podStartE2EDuration="15.116255679s" podCreationTimestamp="2026-02-17 14:33:25 +0000 UTC" firstStartedPulling="2026-02-17 14:33:25.823966619 +0000 UTC m=+1632.166894888" lastFinishedPulling="2026-02-17 14:33:39.464064628 +0000 UTC m=+1645.806992897" observedRunningTime="2026-02-17 14:33:40.103344682 +0000 UTC m=+1646.446272961" watchObservedRunningTime="2026-02-17 14:33:40.116255679 +0000 UTC m=+1646.459183958"
Feb 17 14:33:56 crc kubenswrapper[4836]: I0217 14:33:56.443388 4836 generic.go:334] "Generic (PLEG): container finished" podID="ac579e56-4f23-4b65-ad07-a1df27f67146" containerID="0b1fdb782cc59c87b5c334a8e29bc01c7def7137ff5e1a24115754176ed4d2ab" exitCode=0
Feb 17 14:33:56 crc kubenswrapper[4836]: I0217 14:33:56.443473 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/crc-debug-stlbp" event={"ID":"ac579e56-4f23-4b65-ad07-a1df27f67146","Type":"ContainerDied","Data":"0b1fdb782cc59c87b5c334a8e29bc01c7def7137ff5e1a24115754176ed4d2ab"}
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.588418 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.602335 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac579e56-4f23-4b65-ad07-a1df27f67146-host\") pod \"ac579e56-4f23-4b65-ad07-a1df27f67146\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") "
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.602460 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac579e56-4f23-4b65-ad07-a1df27f67146-host" (OuterVolumeSpecName: "host") pod "ac579e56-4f23-4b65-ad07-a1df27f67146" (UID: "ac579e56-4f23-4b65-ad07-a1df27f67146"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.602759 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc86l\" (UniqueName: \"kubernetes.io/projected/ac579e56-4f23-4b65-ad07-a1df27f67146-kube-api-access-zc86l\") pod \"ac579e56-4f23-4b65-ad07-a1df27f67146\" (UID: \"ac579e56-4f23-4b65-ad07-a1df27f67146\") "
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.605212 4836 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ac579e56-4f23-4b65-ad07-a1df27f67146-host\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.609452 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac579e56-4f23-4b65-ad07-a1df27f67146-kube-api-access-zc86l" (OuterVolumeSpecName: "kube-api-access-zc86l") pod "ac579e56-4f23-4b65-ad07-a1df27f67146" (UID: "ac579e56-4f23-4b65-ad07-a1df27f67146"). InnerVolumeSpecName "kube-api-access-zc86l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.643715 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-snsbl/crc-debug-stlbp"]
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.667191 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-snsbl/crc-debug-stlbp"]
Feb 17 14:33:57 crc kubenswrapper[4836]: I0217 14:33:57.706007 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc86l\" (UniqueName: \"kubernetes.io/projected/ac579e56-4f23-4b65-ad07-a1df27f67146-kube-api-access-zc86l\") on node \"crc\" DevicePath \"\""
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.468018 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9dafc80a9454cecd841ba44f7dffc8dfd96b47bb5dbc19c92e88113d30344a"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.468089 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-stlbp"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.581902 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac579e56-4f23-4b65-ad07-a1df27f67146" path="/var/lib/kubelet/pods/ac579e56-4f23-4b65-ad07-a1df27f67146/volumes"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.853580 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snsbl/crc-debug-pv9j4"]
Feb 17 14:33:58 crc kubenswrapper[4836]: E0217 14:33:58.854377 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac579e56-4f23-4b65-ad07-a1df27f67146" containerName="container-00"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.854392 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac579e56-4f23-4b65-ad07-a1df27f67146" containerName="container-00"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.854621 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac579e56-4f23-4b65-ad07-a1df27f67146" containerName="container-00"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.855367 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.858075 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-snsbl"/"default-dockercfg-ht85k"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.900785 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlqh\" (UniqueName: \"kubernetes.io/projected/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-kube-api-access-7rlqh\") pod \"crc-debug-pv9j4\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") " pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:58 crc kubenswrapper[4836]: I0217 14:33:58.901102 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-host\") pod \"crc-debug-pv9j4\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") " pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.003375 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-host\") pod \"crc-debug-pv9j4\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") " pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.003602 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-host\") pod \"crc-debug-pv9j4\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") " pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.003641 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlqh\" (UniqueName: \"kubernetes.io/projected/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-kube-api-access-7rlqh\") pod \"crc-debug-pv9j4\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") " pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.029569 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlqh\" (UniqueName: \"kubernetes.io/projected/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-kube-api-access-7rlqh\") pod \"crc-debug-pv9j4\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") " pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.175519 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.481088 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/crc-debug-pv9j4" event={"ID":"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf","Type":"ContainerStarted","Data":"52187a78be9c156d0265b94c5488b994ed0bab26684ef18004535fce0431373c"}
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.764979 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.765101 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.765188 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.766448 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 14:33:59 crc kubenswrapper[4836]: I0217 14:33:59.766543 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" gracePeriod=600
Feb 17 14:33:59 crc kubenswrapper[4836]: E0217 14:33:59.897409 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.500971 4836 generic.go:334] "Generic (PLEG): container finished" podID="3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf" containerID="b099deccdc43aaaf5e1d9673615b93cdbff588beb42726f387dc2c0ef267fb73" exitCode=1
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.501080 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/crc-debug-pv9j4" event={"ID":"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf","Type":"ContainerDied","Data":"b099deccdc43aaaf5e1d9673615b93cdbff588beb42726f387dc2c0ef267fb73"}
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.520085 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" exitCode=0
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.520158 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"}
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.520206 4836 scope.go:117] "RemoveContainer" containerID="3c09fe81ffce38e5d9ef4195d8e69df0edfb238c5a8b73cb36be460e79dea4bb"
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.521688 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:34:00 crc kubenswrapper[4836]: E0217 14:34:00.522126 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.616253 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-snsbl/crc-debug-pv9j4"]
Feb 17 14:34:00 crc kubenswrapper[4836]: I0217 14:34:00.621971 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-snsbl/crc-debug-pv9j4"]
Feb 17 14:34:01 crc kubenswrapper[4836]: I0217 14:34:01.641427 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:34:01 crc kubenswrapper[4836]: I0217 14:34:01.834328 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rlqh\" (UniqueName: \"kubernetes.io/projected/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-kube-api-access-7rlqh\") pod \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") "
Feb 17 14:34:01 crc kubenswrapper[4836]: I0217 14:34:01.834395 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-host\") pod \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\" (UID: \"3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf\") "
Feb 17 14:34:01 crc kubenswrapper[4836]: I0217 14:34:01.835091 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-host" (OuterVolumeSpecName: "host") pod "3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf" (UID: "3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 14:34:01 crc kubenswrapper[4836]: I0217 14:34:01.842884 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-kube-api-access-7rlqh" (OuterVolumeSpecName: "kube-api-access-7rlqh") pod "3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf" (UID: "3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf"). InnerVolumeSpecName "kube-api-access-7rlqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 14:34:01 crc kubenswrapper[4836]: I0217 14:34:01.937745 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rlqh\" (UniqueName: \"kubernetes.io/projected/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-kube-api-access-7rlqh\") on node \"crc\" DevicePath \"\""
Feb 17 14:34:01 crc kubenswrapper[4836]: I0217 14:34:01.937788 4836 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf-host\") on node \"crc\" DevicePath \"\""
Feb 17 14:34:02 crc kubenswrapper[4836]: I0217 14:34:02.546137 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52187a78be9c156d0265b94c5488b994ed0bab26684ef18004535fce0431373c"
Feb 17 14:34:02 crc kubenswrapper[4836]: I0217 14:34:02.546521 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/crc-debug-pv9j4"
Feb 17 14:34:02 crc kubenswrapper[4836]: I0217 14:34:02.584252 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf" path="/var/lib/kubelet/pods/3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf/volumes"
Feb 17 14:34:14 crc kubenswrapper[4836]: I0217 14:34:14.578833 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:34:14 crc kubenswrapper[4836]: E0217 14:34:14.579899 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:34:23 crc kubenswrapper[4836]: I0217 14:34:23.147103 4836 scope.go:117] "RemoveContainer" containerID="85bf6d2c05b11776e36fd7dffb8368edf8f8e5b125a942780ac6175dd831a159"
Feb 17 14:34:23 crc kubenswrapper[4836]: I0217 14:34:23.190873 4836 scope.go:117] "RemoveContainer" containerID="14423eb209623d815ed52e92ff6318e5e659fcf35e927a649dbd595f58224937"
Feb 17 14:34:29 crc kubenswrapper[4836]: I0217 14:34:29.568421 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:34:29 crc kubenswrapper[4836]: E0217 14:34:29.570533 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:34:40 crc kubenswrapper[4836]: I0217 14:34:40.568665 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:34:40 crc kubenswrapper[4836]: E0217 14:34:40.570030 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:34:54 crc kubenswrapper[4836]: I0217 14:34:54.579178 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:34:54 crc kubenswrapper[4836]: E0217 14:34:54.580319 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:35:07 crc kubenswrapper[4836]: I0217 14:35:07.569155 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:35:07 crc kubenswrapper[4836]: E0217 14:35:07.570539 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:35:08 crc kubenswrapper[4836]: I0217 14:35:08.071587 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_039a526c-4f5a-4641-9340-b18459145569/init-config-reloader/0.log"
Feb 17 14:35:08 crc kubenswrapper[4836]: I0217 14:35:08.769270 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_039a526c-4f5a-4641-9340-b18459145569/init-config-reloader/0.log"
Feb 17 14:35:08 crc kubenswrapper[4836]: I0217 14:35:08.861390 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_039a526c-4f5a-4641-9340-b18459145569/alertmanager/0.log"
Feb 17 14:35:08 crc kubenswrapper[4836]: I0217 14:35:08.906146 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_039a526c-4f5a-4641-9340-b18459145569/config-reloader/0.log"
Feb 17 14:35:09 crc kubenswrapper[4836]: I0217 14:35:09.342961 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-652c-account-create-update-lswdv_767841a7-db94-430a-b408-10e5bd0350e5/mariadb-account-create-update/0.log"
Feb 17 14:35:09 crc kubenswrapper[4836]: I0217 14:35:09.356953 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7dc9c9fdbb-zxjj6_62b902ba-6ba2-48f3-a6dc-652fd1d6d58c/barbican-api/0.log"
Feb 17 14:35:09 crc kubenswrapper[4836]: I0217 14:35:09.452241 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7dc9c9fdbb-zxjj6_62b902ba-6ba2-48f3-a6dc-652fd1d6d58c/barbican-api-log/0.log"
Feb 17 14:35:09 crc kubenswrapper[4836]: I0217 14:35:09.673196 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-69hk6_4edeb89f-0bd9-466e-a9f9-2d45575d2c72/mariadb-database-create/0.log"
Feb 17 14:35:09 crc kubenswrapper[4836]: I0217 14:35:09.713455 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-g9l4s_18361bc2-5db1-4611-be18-38593e0b5d5d/barbican-db-sync/0.log"
Feb 17 14:35:09 crc kubenswrapper[4836]: I0217 14:35:09.902863 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68fd77ffbb-m5r5c_f79d706e-2d22-49c6-acb5-dc3f130ab102/barbican-keystone-listener/0.log"
Feb 17 14:35:09 crc kubenswrapper[4836]: I0217 14:35:09.994104 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68fd77ffbb-m5r5c_f79d706e-2d22-49c6-acb5-dc3f130ab102/barbican-keystone-listener-log/0.log"
Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.144566 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6567fb9c77-xcq7p_bf33e52a-365f-4ccc-8352-f4c7f8e2aebd/barbican-worker/0.log"
Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.285402 4836 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack_barbican-worker-6567fb9c77-xcq7p_bf33e52a-365f-4ccc-8352-f4c7f8e2aebd/barbican-worker-log/0.log" Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.330798 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ddcf30e-7916-4b59-8986-a5d2c218170e/ceilometer-central-agent/0.log" Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.423428 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ddcf30e-7916-4b59-8986-a5d2c218170e/ceilometer-notification-agent/0.log" Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.525047 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ddcf30e-7916-4b59-8986-a5d2c218170e/sg-core/0.log" Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.527974 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1ddcf30e-7916-4b59-8986-a5d2c218170e/proxy-httpd/0.log" Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.653765 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-0d11-account-create-update-jf72z_f9ee15e8-6695-454f-83ad-d54176458497/mariadb-account-create-update/0.log" Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.883121 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8722776f-950d-46d6-8929-164cc70747af/cinder-api/0.log" Feb 17 14:35:10 crc kubenswrapper[4836]: I0217 14:35:10.889117 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8722776f-950d-46d6-8929-164cc70747af/cinder-api-log/0.log" Feb 17 14:35:11 crc kubenswrapper[4836]: I0217 14:35:11.084071 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-w5qdk_eb354e85-311d-40bb-ae4a-5c535d4d89b9/mariadb-database-create/0.log" Feb 17 14:35:11 crc kubenswrapper[4836]: I0217 14:35:11.109989 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-db-sync-qqwhc_8185c649-f1ad-4230-830d-07d002e5b358/cinder-db-sync/0.log" Feb 17 14:35:12 crc kubenswrapper[4836]: I0217 14:35:12.088286 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0e6a7955-6cfb-4afe-b94a-8900513e5821/probe/0.log" Feb 17 14:35:12 crc kubenswrapper[4836]: I0217 14:35:12.210445 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-2ea0-account-create-update-p7p99_2ee1a0f2-86df-4f97-957a-22bbd7da4505/mariadb-account-create-update/0.log" Feb 17 14:35:12 crc kubenswrapper[4836]: I0217 14:35:12.266754 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0e6a7955-6cfb-4afe-b94a-8900513e5821/cinder-scheduler/0.log" Feb 17 14:35:12 crc kubenswrapper[4836]: I0217 14:35:12.727763 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49/cloudkitty-api/0.log" Feb 17 14:35:12 crc kubenswrapper[4836]: I0217 14:35:12.833614 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_260ff3a2-31e2-4ef9-9d7a-ef52ddb0fc49/cloudkitty-api-log/0.log" Feb 17 14:35:12 crc kubenswrapper[4836]: I0217 14:35:12.885054 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-db-create-jjrp2_a1fe36f3-d6b6-44e0-b85b-6def754fd08e/mariadb-database-create/0.log" Feb 17 14:35:13 crc kubenswrapper[4836]: I0217 14:35:13.068795 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-db-sync-pvljf_4e016162-2025-44ad-989d-ce71d9f8f9bf/cloudkitty-db-sync/0.log" Feb 17 14:35:13 crc kubenswrapper[4836]: I0217 14:35:13.120102 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_e2c3e649-7933-49e2-800c-b66dbd377ac6/loki-compactor/0.log" Feb 17 14:35:13 crc kubenswrapper[4836]: I0217 14:35:13.347679 4836 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-r4gdh_33c54f8c-91c4-4742-b545-d0e2c4e85fe2/loki-distributor/0.log" Feb 17 14:35:13 crc kubenswrapper[4836]: I0217 14:35:13.448620 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-nbvnf_a977b831-7959-4509-93bf-a45b375ca722/gateway/0.log" Feb 17 14:35:13 crc kubenswrapper[4836]: I0217 14:35:13.597243 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-q78z5_974f66b3-690f-4008-949d-1d57c978d427/gateway/0.log" Feb 17 14:35:13 crc kubenswrapper[4836]: I0217 14:35:13.731396 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_d370240e-d6c1-4d9c-9877-293afa6e77f2/loki-index-gateway/0.log" Feb 17 14:35:13 crc kubenswrapper[4836]: I0217 14:35:13.910779 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_1c33fb01-9bf7-43f1-86d5-004e70d3721c/loki-ingester/0.log" Feb 17 14:35:14 crc kubenswrapper[4836]: I0217 14:35:14.049610 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-fsq2h_27c5f450-8bef-4732-a7fb-272d9b5a4ea8/loki-querier/0.log" Feb 17 14:35:14 crc kubenswrapper[4836]: I0217 14:35:14.245329 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-bqn6j_487d19a3-7f23-4945-bfe1-6231a37a84c6/loki-query-frontend/0.log" Feb 17 14:35:14 crc kubenswrapper[4836]: I0217 14:35:14.449395 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-storageinit-9z4jp_f38b5f94-bc8b-4e64-abe6-8c39b920cb4b/cloudkitty-storageinit/0.log" Feb 17 14:35:15 crc kubenswrapper[4836]: I0217 14:35:15.085377 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-5fd9b586ff-snjhj_6dc084a0-be89-4371-92a3-181cfe1979ce/init/0.log" Feb 17 14:35:15 crc kubenswrapper[4836]: I0217 14:35:15.462323 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fd9b586ff-snjhj_6dc084a0-be89-4371-92a3-181cfe1979ce/dnsmasq-dns/0.log" Feb 17 14:35:15 crc kubenswrapper[4836]: I0217 14:35:15.589828 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5fd9b586ff-snjhj_6dc084a0-be89-4371-92a3-181cfe1979ce/init/0.log" Feb 17 14:35:15 crc kubenswrapper[4836]: I0217 14:35:15.604197 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-d162-account-create-update-khb5j_1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5/mariadb-account-create-update/0.log" Feb 17 14:35:16 crc kubenswrapper[4836]: I0217 14:35:16.134848 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-z8g7x_df3a6cf1-bca0-45b2-9f7c-6d483452d49d/glance-db-sync/0.log" Feb 17 14:35:16 crc kubenswrapper[4836]: I0217 14:35:16.166580 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-pn587_77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b/mariadb-database-create/0.log" Feb 17 14:35:16 crc kubenswrapper[4836]: I0217 14:35:16.434560 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b5121f0d-e93f-44c6-96b5-4ed7b6ec960e/glance-log/0.log" Feb 17 14:35:16 crc kubenswrapper[4836]: I0217 14:35:16.527564 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b5121f0d-e93f-44c6-96b5-4ed7b6ec960e/glance-httpd/0.log" Feb 17 14:35:16 crc kubenswrapper[4836]: I0217 14:35:16.777825 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_172fadf8-99d3-436a-b711-010e8ffe289b/glance-httpd/0.log" Feb 17 14:35:16 crc kubenswrapper[4836]: I0217 14:35:16.873396 4836 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_172fadf8-99d3-436a-b711-010e8ffe289b/glance-log/0.log" Feb 17 14:35:17 crc kubenswrapper[4836]: I0217 14:35:17.127989 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-proc-0_79c00bb2-9487-433a-be90-07b6d885e4d0/cloudkitty-proc/0.log" Feb 17 14:35:17 crc kubenswrapper[4836]: I0217 14:35:17.244326 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-vmgps_10331926-261d-4e44-a8c2-89846903ca12/keystone-bootstrap/0.log" Feb 17 14:35:17 crc kubenswrapper[4836]: I0217 14:35:17.306224 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-78c4d587b5-cqhdl_f2f9acba-3f54-43b6-9461-31cba0cc954b/keystone-api/0.log" Feb 17 14:35:17 crc kubenswrapper[4836]: I0217 14:35:17.566191 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-d8f3-account-create-update-kmlvm_2ae1659d-7892-4744-a570-4ba7c65e4caf/mariadb-account-create-update/0.log" Feb 17 14:35:17 crc kubenswrapper[4836]: I0217 14:35:17.572927 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-k7zc9_e562d506-21d2-4edd-90b8-97bd11bf068e/mariadb-database-create/0.log" Feb 17 14:35:17 crc kubenswrapper[4836]: I0217 14:35:17.635287 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-q25rr_6a1d4ef8-03d9-42d8-ae0b-9410767ed25f/keystone-db-sync/0.log" Feb 17 14:35:17 crc kubenswrapper[4836]: I0217 14:35:17.935934 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8809e181-9f70-4810-97e8-6fc4c9e3561a/kube-state-metrics/0.log" Feb 17 14:35:18 crc kubenswrapper[4836]: I0217 14:35:18.829813 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-14cb-account-create-update-xw2dd_623225aa-2492-494e-be5b-92acef6f23cf/mariadb-account-create-update/0.log" Feb 17 14:35:18 crc 
kubenswrapper[4836]: I0217 14:35:18.870878 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fc4994bf7-cqhhj_88848d0f-5d90-4ca0-9a78-d08e73159601/neutron-api/0.log" Feb 17 14:35:18 crc kubenswrapper[4836]: I0217 14:35:18.939089 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fc4994bf7-cqhhj_88848d0f-5d90-4ca0-9a78-d08e73159601/neutron-httpd/0.log" Feb 17 14:35:19 crc kubenswrapper[4836]: I0217 14:35:19.143912 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-nwjd8_d4ce1c7a-57e8-491e-84ab-8aed8baea37b/mariadb-database-create/0.log" Feb 17 14:35:19 crc kubenswrapper[4836]: I0217 14:35:19.303685 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-sb6h7_81ddbaec-f370-44a3-802b-26980ea65d2f/neutron-db-sync/0.log" Feb 17 14:35:19 crc kubenswrapper[4836]: I0217 14:35:19.807250 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8815111-fe36-4868-b092-2f88255f8f2b/nova-api-api/0.log" Feb 17 14:35:19 crc kubenswrapper[4836]: I0217 14:35:19.811320 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8815111-fe36-4868-b092-2f88255f8f2b/nova-api-log/0.log" Feb 17 14:35:20 crc kubenswrapper[4836]: I0217 14:35:20.150578 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-a7c4-account-create-update-qj5lb_c7d61f8c-4804-49b6-937e-fbaf20aa3ed2/mariadb-account-create-update/0.log" Feb 17 14:35:20 crc kubenswrapper[4836]: I0217 14:35:20.262733 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-q8wrd_88b1aa3a-dc15-4ec1-ba76-8246e300422f/mariadb-database-create/0.log" Feb 17 14:35:20 crc kubenswrapper[4836]: I0217 14:35:20.482493 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-8fba-account-create-update-gqd5n_0b8171da-ad25-4388-9dab-2afc19993d97/mariadb-account-create-update/0.log" Feb 17 14:35:20 crc kubenswrapper[4836]: I0217 14:35:20.550280 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-lqvvn_3f9d6a93-3d3a-4c5c-85cf-329209cfe911/nova-manage/0.log" Feb 17 14:35:20 crc kubenswrapper[4836]: I0217 14:35:20.823502 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_00cffdcb-70af-415e-86a8-4f8eb7c0ba6f/nova-cell0-conductor-conductor/0.log" Feb 17 14:35:20 crc kubenswrapper[4836]: I0217 14:35:20.901029 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-896gw_5284ac65-3629-4b0f-94ce-114964fe6d15/nova-cell0-conductor-db-sync/0.log" Feb 17 14:35:21 crc kubenswrapper[4836]: I0217 14:35:21.145065 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-npl52_db342a3d-55f5-4b0c-b96f-327014b6fb82/mariadb-database-create/0.log" Feb 17 14:35:21 crc kubenswrapper[4836]: I0217 14:35:21.653172 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-28f5-account-create-update-74tvm_4dc00367-2940-413d-872a-74d4fa37fc1f/mariadb-account-create-update/0.log" Feb 17 14:35:21 crc kubenswrapper[4836]: I0217 14:35:21.910401 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-h4mlr_079f20c9-f742-4c4b-a8c0-a2a09573bf62/nova-manage/0.log" Feb 17 14:35:22 crc kubenswrapper[4836]: I0217 14:35:22.175839 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-bz94v_790a788c-3cfe-49c8-b1ff-a83bcedf17e0/nova-cell1-conductor-db-sync/0.log" Feb 17 14:35:22 crc kubenswrapper[4836]: I0217 14:35:22.190846 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_ed905f2c-85b9-4684-a376-674caf693eca/nova-cell1-conductor-conductor/0.log" Feb 17 14:35:22 crc kubenswrapper[4836]: I0217 14:35:22.465534 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-5h5m9_0312359b-98a6-49c7-83f1-fb44c679e8aa/mariadb-database-create/0.log" Feb 17 14:35:22 crc kubenswrapper[4836]: I0217 14:35:22.569043 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:35:22 crc kubenswrapper[4836]: E0217 14:35:22.569857 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:35:22 crc kubenswrapper[4836]: I0217 14:35:22.661242 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6d9c8dd5-2ccb-4656-a059-352c03aa923d/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 14:35:22 crc kubenswrapper[4836]: I0217 14:35:22.867851 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c56150e0-07ff-4a45-9231-26fa261942c4/nova-metadata-metadata/0.log" Feb 17 14:35:22 crc kubenswrapper[4836]: I0217 14:35:22.921108 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_c56150e0-07ff-4a45-9231-26fa261942c4/nova-metadata-log/0.log" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.123991 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6bfcfdb5-3886-47e2-8e71-33c95dc14e73/nova-scheduler-scheduler/0.log" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.341257 4836 
scope.go:117] "RemoveContainer" containerID="1fc9116efed5aa1cde1e1851a8feece763300523cbdc4d6253a5c08f4f4f9f36" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.473521 4836 scope.go:117] "RemoveContainer" containerID="9ee60ada822c522c9249d0e3c31f511d939804abdb610bce124e951b7000a09d" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.546807 4836 scope.go:117] "RemoveContainer" containerID="b7a5e210ee7a505ae087f3c56329942b71db962383e4ae1693812dd8340169c8" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.585346 4836 scope.go:117] "RemoveContainer" containerID="407f5678203e5e174c01300835b55b61252a1ab248014426970911ab531d756b" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.599157 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a6016745-1634-4eb6-afee-b98ce9ab8f56/mysql-bootstrap/0.log" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.634362 4836 scope.go:117] "RemoveContainer" containerID="d2098b2a7c4dcbee4fa27ea9bfa1c19e32c5f83e96aa663b877abb8284852c74" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.853263 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a6016745-1634-4eb6-afee-b98ce9ab8f56/mysql-bootstrap/0.log" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.911722 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a6016745-1634-4eb6-afee-b98ce9ab8f56/galera/0.log" Feb 17 14:35:23 crc kubenswrapper[4836]: I0217 14:35:23.977110 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fd891e0-6f97-4fa3-8281-aa97232d6c6d/mysql-bootstrap/0.log" Feb 17 14:35:24 crc kubenswrapper[4836]: I0217 14:35:24.320583 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fd891e0-6f97-4fa3-8281-aa97232d6c6d/mysql-bootstrap/0.log" Feb 17 14:35:24 crc kubenswrapper[4836]: I0217 14:35:24.377157 4836 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4fe674a8-c32b-412e-8d20-2a6e7e18bb10/openstackclient/0.log" Feb 17 14:35:24 crc kubenswrapper[4836]: I0217 14:35:24.411370 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_2fd891e0-6f97-4fa3-8281-aa97232d6c6d/galera/0.log" Feb 17 14:35:25 crc kubenswrapper[4836]: I0217 14:35:25.492410 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ghk5k_5949d44f-ef6d-417e-9035-9b235cd59863/ovn-controller/0.log" Feb 17 14:35:25 crc kubenswrapper[4836]: I0217 14:35:25.512048 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6s7lx_bf32834e-7ae4-4e3b-b532-dd87f6a9223e/openstack-network-exporter/0.log" Feb 17 14:35:25 crc kubenswrapper[4836]: I0217 14:35:25.724696 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4jj9_cefe420d-f25c-4681-9ae8-b61f0a354282/ovsdb-server-init/0.log" Feb 17 14:35:26 crc kubenswrapper[4836]: I0217 14:35:26.103166 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4jj9_cefe420d-f25c-4681-9ae8-b61f0a354282/ovsdb-server/0.log" Feb 17 14:35:26 crc kubenswrapper[4836]: I0217 14:35:26.128357 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4jj9_cefe420d-f25c-4681-9ae8-b61f0a354282/ovsdb-server-init/0.log" Feb 17 14:35:26 crc kubenswrapper[4836]: I0217 14:35:26.147192 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-j4jj9_cefe420d-f25c-4681-9ae8-b61f0a354282/ovs-vswitchd/0.log" Feb 17 14:35:26 crc kubenswrapper[4836]: I0217 14:35:26.712682 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0f031114-b776-4180-ab6e-eb5868f34d3e/openstack-network-exporter/0.log" Feb 17 14:35:26 crc kubenswrapper[4836]: I0217 14:35:26.821377 4836 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-northd-0_0f031114-b776-4180-ab6e-eb5868f34d3e/ovn-northd/0.log" Feb 17 14:35:26 crc kubenswrapper[4836]: I0217 14:35:26.869050 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55bc1962-7790-448a-838c-cb13a870ea23/openstack-network-exporter/0.log" Feb 17 14:35:27 crc kubenswrapper[4836]: I0217 14:35:27.029837 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_55bc1962-7790-448a-838c-cb13a870ea23/ovsdbserver-nb/0.log" Feb 17 14:35:27 crc kubenswrapper[4836]: I0217 14:35:27.226435 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_348d02a8-d1b2-4bd3-9f4c-9153e24a5f19/openstack-network-exporter/0.log" Feb 17 14:35:27 crc kubenswrapper[4836]: I0217 14:35:27.243431 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_348d02a8-d1b2-4bd3-9f4c-9153e24a5f19/ovsdbserver-sb/0.log" Feb 17 14:35:27 crc kubenswrapper[4836]: I0217 14:35:27.497093 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-83de-account-create-update-fh75b_54905e17-d443-4465-8f70-7be04a89086f/mariadb-account-create-update/0.log" Feb 17 14:35:27 crc kubenswrapper[4836]: I0217 14:35:27.609862 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc958ddf6-kh2rq_42c3b1e3-728a-4bd8-9669-bfe1656b6de2/placement-api/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.104949 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-bc958ddf6-kh2rq_42c3b1e3-728a-4bd8-9669-bfe1656b6de2/placement-log/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.382673 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-hx7tv_add50d48-0a1c-4d2f-bcc3-ae9355e95c3b/mariadb-database-create/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.407034 4836 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-db-sync-pdhxs_1fe4b42c-afbf-41e1-8035-5fffb156eadc/placement-db-sync/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.668062 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6fec8667-7189-4e29-8362-37dd935d2db7/init-config-reloader/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.879926 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6fec8667-7189-4e29-8362-37dd935d2db7/config-reloader/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.889414 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6fec8667-7189-4e29-8362-37dd935d2db7/init-config-reloader/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.915929 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6fec8667-7189-4e29-8362-37dd935d2db7/prometheus/0.log" Feb 17 14:35:28 crc kubenswrapper[4836]: I0217 14:35:28.932871 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6fec8667-7189-4e29-8362-37dd935d2db7/thanos-sidecar/0.log" Feb 17 14:35:29 crc kubenswrapper[4836]: I0217 14:35:29.166003 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f866bb7-5209-4275-8884-df6f074b3f7c/setup-container/0.log" Feb 17 14:35:29 crc kubenswrapper[4836]: I0217 14:35:29.692607 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f866bb7-5209-4275-8884-df6f074b3f7c/setup-container/0.log" Feb 17 14:35:29 crc kubenswrapper[4836]: I0217 14:35:29.825731 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6f866bb7-5209-4275-8884-df6f074b3f7c/rabbitmq/0.log" Feb 17 14:35:29 crc kubenswrapper[4836]: I0217 14:35:29.831782 4836 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ec9408e6-0474-4f84-842e-b1c20f42a7b8/setup-container/0.log" Feb 17 14:35:30 crc kubenswrapper[4836]: I0217 14:35:30.180226 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ec9408e6-0474-4f84-842e-b1c20f42a7b8/setup-container/0.log" Feb 17 14:35:30 crc kubenswrapper[4836]: I0217 14:35:30.194893 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-h9gmq_caa6524b-2b3f-47c3-b55f-1435685df59d/mariadb-account-create-update/0.log" Feb 17 14:35:30 crc kubenswrapper[4836]: I0217 14:35:30.232579 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ec9408e6-0474-4f84-842e-b1c20f42a7b8/rabbitmq/0.log" Feb 17 14:35:30 crc kubenswrapper[4836]: I0217 14:35:30.499414 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5d87f46c5f-vfn9f_a17ffb1e-09d2-4524-8c33-e50e15b9031d/proxy-httpd/0.log" Feb 17 14:35:30 crc kubenswrapper[4836]: I0217 14:35:30.584381 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5d87f46c5f-vfn9f_a17ffb1e-09d2-4524-8c33-e50e15b9031d/proxy-server/0.log" Feb 17 14:35:30 crc kubenswrapper[4836]: I0217 14:35:30.799366 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dbzmx_cb33695b-c451-44b2-8a2a-fe534a4040e3/swift-ring-rebalance/0.log" Feb 17 14:35:30 crc kubenswrapper[4836]: I0217 14:35:30.931009 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/account-auditor/0.log" Feb 17 14:35:31 crc kubenswrapper[4836]: I0217 14:35:31.070930 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/account-reaper/0.log" Feb 17 14:35:31 crc kubenswrapper[4836]: I0217 14:35:31.775285 4836 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/account-replicator/0.log" Feb 17 14:35:31 crc kubenswrapper[4836]: I0217 14:35:31.821871 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/account-server/0.log" Feb 17 14:35:31 crc kubenswrapper[4836]: I0217 14:35:31.836225 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/container-auditor/0.log" Feb 17 14:35:31 crc kubenswrapper[4836]: I0217 14:35:31.966623 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_ce3babe4-6d77-45ce-b9cc-626678d3ec64/memcached/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.023197 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/container-replicator/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.052067 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/container-server/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.151484 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/container-updater/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.201324 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/object-auditor/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.315436 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/object-expirer/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.393036 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/object-server/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.430871 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/object-updater/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.433011 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/object-replicator/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.484338 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/rsync/0.log" Feb 17 14:35:32 crc kubenswrapper[4836]: I0217 14:35:32.820693 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_e482046c-502a-4f41-b013-7b3ef1c71ee1/swift-recon-cron/0.log" Feb 17 14:35:35 crc kubenswrapper[4836]: I0217 14:35:35.097485 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:35:35 crc kubenswrapper[4836]: E0217 14:35:35.098231 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:35:47 crc kubenswrapper[4836]: I0217 14:35:47.568991 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:35:47 crc kubenswrapper[4836]: E0217 14:35:47.570381 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:36:02 crc kubenswrapper[4836]: I0217 14:36:02.568888 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:36:02 crc kubenswrapper[4836]: E0217 14:36:02.569686 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:36:11 crc kubenswrapper[4836]: I0217 14:36:11.517612 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm_dc1ca64e-8914-44ae-8d9e-d7c63ba6e166/util/0.log" Feb 17 14:36:11 crc kubenswrapper[4836]: I0217 14:36:11.737992 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm_dc1ca64e-8914-44ae-8d9e-d7c63ba6e166/util/0.log" Feb 17 14:36:11 crc kubenswrapper[4836]: I0217 14:36:11.754161 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm_dc1ca64e-8914-44ae-8d9e-d7c63ba6e166/pull/0.log" Feb 17 14:36:11 crc kubenswrapper[4836]: I0217 14:36:11.840371 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm_dc1ca64e-8914-44ae-8d9e-d7c63ba6e166/pull/0.log" Feb 17 14:36:12 crc kubenswrapper[4836]: I0217 14:36:12.052839 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm_dc1ca64e-8914-44ae-8d9e-d7c63ba6e166/extract/0.log" Feb 17 14:36:12 crc kubenswrapper[4836]: I0217 14:36:12.058577 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm_dc1ca64e-8914-44ae-8d9e-d7c63ba6e166/pull/0.log" Feb 17 14:36:12 crc kubenswrapper[4836]: I0217 14:36:12.100473 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_609d143c5a73782b709463f2a5b0d811d0c25a93f651a8c9a58ebcae612cvgm_dc1ca64e-8914-44ae-8d9e-d7c63ba6e166/util/0.log" Feb 17 14:36:12 crc kubenswrapper[4836]: I0217 14:36:12.969061 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-8wdwr_0962ca43-43c4-4884-bd8e-889835f83632/manager/0.log" Feb 17 14:36:13 crc kubenswrapper[4836]: I0217 14:36:13.295447 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-zxb25_ce77a6a5-95bb-4758-8a38-cdc354fd9d6c/manager/0.log" Feb 17 14:36:13 crc kubenswrapper[4836]: I0217 14:36:13.466954 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-7vwdd_c3d9def3-7f53-4acc-9c46-d37ddf41e3b7/manager/0.log" Feb 17 14:36:13 crc kubenswrapper[4836]: I0217 14:36:13.774226 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-bv7s8_f2e6ac9f-ee72-4a28-b298-9b2f918d0c95/manager/0.log" Feb 17 14:36:14 crc kubenswrapper[4836]: I0217 
14:36:14.420371 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-b6cfm_12cff299-e5ea-40a9-8a69-528c478cd0a0/manager/0.log" Feb 17 14:36:14 crc kubenswrapper[4836]: I0217 14:36:14.643519 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-f4fvp_a1ae24b8-83c8-416d-9d39-24d84eb6cd83/manager/0.log" Feb 17 14:36:14 crc kubenswrapper[4836]: I0217 14:36:14.771485 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-k9p46_e805966b-ea22-4c2a-a6c4-3622300fcb2f/manager/0.log" Feb 17 14:36:14 crc kubenswrapper[4836]: I0217 14:36:14.976762 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-qnb5b_18a63480-edc2-44ed-bd43-b7750f7f8f33/manager/0.log" Feb 17 14:36:15 crc kubenswrapper[4836]: I0217 14:36:15.086920 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-6lzts_9ccd7ed5-2772-4482-af31-2578e98011fd/manager/0.log" Feb 17 14:36:15 crc kubenswrapper[4836]: I0217 14:36:15.693855 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-zkzrs_7b9749c7-038f-4814-9357-623346c9172c/manager/0.log" Feb 17 14:36:15 crc kubenswrapper[4836]: I0217 14:36:15.804932 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-6c4rn_3d12b131-73a0-477e-ab9e-579309b0f5b1/manager/0.log" Feb 17 14:36:16 crc kubenswrapper[4836]: I0217 14:36:16.070575 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-5hz7c_52a90e1a-0e2d-4488-8a1a-34de15bfa3a5/manager/0.log" Feb 17 14:36:16 crc 
kubenswrapper[4836]: I0217 14:36:16.381424 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9ct7xht_4affaaf4-1113-4635-b30f-da26e04f6662/manager/0.log" Feb 17 14:36:17 crc kubenswrapper[4836]: I0217 14:36:17.179057 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7464dc569f-6nqxk_4afa09e7-5273-4170-8c40-6c3ed66e6b8e/operator/0.log" Feb 17 14:36:17 crc kubenswrapper[4836]: I0217 14:36:17.428861 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-pz5pz_f0982db9-e1ef-4fc9-b7d4-e52ac91e6676/registry-server/0.log" Feb 17 14:36:17 crc kubenswrapper[4836]: I0217 14:36:17.568181 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:36:17 crc kubenswrapper[4836]: E0217 14:36:17.568550 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:36:17 crc kubenswrapper[4836]: I0217 14:36:17.790037 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-mq76b_f6ba6343-872d-4e36-accf-959bb437f82d/manager/0.log" Feb 17 14:36:18 crc kubenswrapper[4836]: I0217 14:36:18.565220 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-jnxzt_cf7c4631-b19a-4160-8581-15f72869a60b/manager/0.log" Feb 17 14:36:18 crc kubenswrapper[4836]: I0217 14:36:18.614852 4836 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-llzlm_1bb12b86-1f25-4dd9-a44d-449a6deee701/manager/0.log" Feb 17 14:36:18 crc kubenswrapper[4836]: I0217 14:36:18.911775 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-w4dds_d423f7ba-2751-4d99-8102-3bc52b302161/operator/0.log" Feb 17 14:36:19 crc kubenswrapper[4836]: I0217 14:36:19.039747 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-7ktgs_d0c3c41c-ac60-40f0-bdfb-8fe641c9426a/manager/0.log" Feb 17 14:36:19 crc kubenswrapper[4836]: I0217 14:36:19.193966 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-667f54696f-kskgn_ca022a36-1c0e-4d3b-a6cf-87f4a78cfd48/manager/0.log" Feb 17 14:36:19 crc kubenswrapper[4836]: I0217 14:36:19.512608 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-ztvz2_d4aa765a-0f56-4f05-b02f-f041841bc97d/manager/0.log" Feb 17 14:36:19 crc kubenswrapper[4836]: I0217 14:36:19.661789 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-lmtng_1f238b1a-4c0c-45de-bb7a-12946f426b89/manager/0.log" Feb 17 14:36:19 crc kubenswrapper[4836]: I0217 14:36:19.774248 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d6964fcdb-rbq62_a3c22d9b-6ba0-4dd2-861d-8685c18e9330/manager/0.log" Feb 17 14:36:22 crc kubenswrapper[4836]: I0217 14:36:22.177055 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-54696_a7c6acc7-4243-4c0d-a723-e83dc2e054df/manager/0.log" Feb 17 14:36:23 crc kubenswrapper[4836]: I0217 14:36:23.880468 4836 
scope.go:117] "RemoveContainer" containerID="f0419b2e3c8ef9c0f54a84e7512a9cde99f00cb5aa7b44a637e787be45f07ccd" Feb 17 14:36:23 crc kubenswrapper[4836]: I0217 14:36:23.914313 4836 scope.go:117] "RemoveContainer" containerID="fcda893980936a4e72f451c12f1e7a2007edb9f1324581557ec99a4e77ee81f9" Feb 17 14:36:23 crc kubenswrapper[4836]: I0217 14:36:23.950286 4836 scope.go:117] "RemoveContainer" containerID="d3297c8494404e0f55bc6c3d7032d9a3295e84dc803655d2e2df3e6ab7a747be" Feb 17 14:36:23 crc kubenswrapper[4836]: I0217 14:36:23.982089 4836 scope.go:117] "RemoveContainer" containerID="4337aced693eb74520c39cdaad50c2d06e723483b872e61eb2f707cc9550085e" Feb 17 14:36:28 crc kubenswrapper[4836]: I0217 14:36:28.568955 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:36:28 crc kubenswrapper[4836]: E0217 14:36:28.569993 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:36:39 crc kubenswrapper[4836]: I0217 14:36:39.569429 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:36:39 crc kubenswrapper[4836]: E0217 14:36:39.570315 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 
14:36:44 crc kubenswrapper[4836]: I0217 14:36:44.059274 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pn587"] Feb 17 14:36:44 crc kubenswrapper[4836]: I0217 14:36:44.074039 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pn587"] Feb 17 14:36:44 crc kubenswrapper[4836]: I0217 14:36:44.583654 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b" path="/var/lib/kubelet/pods/77cd04d8-7d93-4b49-b91b-8cc5fdf8f53b/volumes" Feb 17 14:36:45 crc kubenswrapper[4836]: I0217 14:36:45.058883 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hx7tv"] Feb 17 14:36:45 crc kubenswrapper[4836]: I0217 14:36:45.071655 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hx7tv"] Feb 17 14:36:46 crc kubenswrapper[4836]: I0217 14:36:46.583960 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add50d48-0a1c-4d2f-bcc3-ae9355e95c3b" path="/var/lib/kubelet/pods/add50d48-0a1c-4d2f-bcc3-ae9355e95c3b/volumes" Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.043021 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d162-account-create-update-khb5j"] Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.054869 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-83de-account-create-update-fh75b"] Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.067890 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d8f3-account-create-update-kmlvm"] Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.083851 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d162-account-create-update-khb5j"] Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.095150 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-d8f3-account-create-update-kmlvm"] Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.106096 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-k7zc9"] Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.117332 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-83de-account-create-update-fh75b"] Feb 17 14:36:47 crc kubenswrapper[4836]: I0217 14:36:47.127841 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-k7zc9"] Feb 17 14:36:48 crc kubenswrapper[4836]: I0217 14:36:48.585398 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5" path="/var/lib/kubelet/pods/1a5ee61c-8aa3-4bb4-a7da-2d61ca1561f5/volumes" Feb 17 14:36:48 crc kubenswrapper[4836]: I0217 14:36:48.586501 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae1659d-7892-4744-a570-4ba7c65e4caf" path="/var/lib/kubelet/pods/2ae1659d-7892-4744-a570-4ba7c65e4caf/volumes" Feb 17 14:36:48 crc kubenswrapper[4836]: I0217 14:36:48.587409 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54905e17-d443-4465-8f70-7be04a89086f" path="/var/lib/kubelet/pods/54905e17-d443-4465-8f70-7be04a89086f/volumes" Feb 17 14:36:48 crc kubenswrapper[4836]: I0217 14:36:48.588147 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e562d506-21d2-4edd-90b8-97bd11bf068e" path="/var/lib/kubelet/pods/e562d506-21d2-4edd-90b8-97bd11bf068e/volumes" Feb 17 14:36:48 crc kubenswrapper[4836]: I0217 14:36:48.807664 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jhzxl_cea58b47-da5e-4dc7-be23-19d8408318d7/control-plane-machine-set-operator/0.log" Feb 17 14:36:49 crc kubenswrapper[4836]: I0217 14:36:49.437210 4836 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jjmwc_1ecc7c98-e9a3-4850-a741-7e0bcf670e27/machine-api-operator/0.log" Feb 17 14:36:49 crc kubenswrapper[4836]: I0217 14:36:49.444465 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jjmwc_1ecc7c98-e9a3-4850-a741-7e0bcf670e27/kube-rbac-proxy/0.log" Feb 17 14:36:52 crc kubenswrapper[4836]: I0217 14:36:52.575104 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:36:52 crc kubenswrapper[4836]: E0217 14:36:52.576090 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:37:00 crc kubenswrapper[4836]: I0217 14:37:00.062271 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-h9gmq"] Feb 17 14:37:00 crc kubenswrapper[4836]: I0217 14:37:00.089775 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-h9gmq"] Feb 17 14:37:00 crc kubenswrapper[4836]: I0217 14:37:00.581218 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa6524b-2b3f-47c3-b55f-1435685df59d" path="/var/lib/kubelet/pods/caa6524b-2b3f-47c3-b55f-1435685df59d/volumes" Feb 17 14:37:04 crc kubenswrapper[4836]: I0217 14:37:04.586543 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:37:04 crc kubenswrapper[4836]: E0217 14:37:04.588157 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:37:06 crc kubenswrapper[4836]: I0217 14:37:06.078181 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-vtfx4_63f75031-4e24-42f7-80cc-2f3fb289dac0/cert-manager-controller/0.log" Feb 17 14:37:06 crc kubenswrapper[4836]: I0217 14:37:06.230619 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-dmddv_918985c6-76a8-4bb2-8868-278b633133a9/cert-manager-cainjector/0.log" Feb 17 14:37:06 crc kubenswrapper[4836]: I0217 14:37:06.331734 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-zhbzj_662067b4-39c2-4ab7-adb4-ba8a6330b0b9/cert-manager-webhook/0.log" Feb 17 14:37:16 crc kubenswrapper[4836]: I0217 14:37:16.568269 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:37:16 crc kubenswrapper[4836]: E0217 14:37:16.569452 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:37:21 crc kubenswrapper[4836]: I0217 14:37:21.601523 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-q985f_8fc6d41c-a8a1-4fe3-ade2-b79761920b17/nmstate-console-plugin/0.log" Feb 17 
14:37:21 crc kubenswrapper[4836]: I0217 14:37:21.789245 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-w8wbg_9ff842c9-08b8-4363-b82a-5f7e2461ec2a/nmstate-handler/0.log" Feb 17 14:37:21 crc kubenswrapper[4836]: I0217 14:37:21.824099 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-877xf_0d0615b5-ef3b-4932-957c-a4b44f35c1a9/kube-rbac-proxy/0.log" Feb 17 14:37:21 crc kubenswrapper[4836]: I0217 14:37:21.926025 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-877xf_0d0615b5-ef3b-4932-957c-a4b44f35c1a9/nmstate-metrics/0.log" Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.042528 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-14cb-account-create-update-xw2dd"] Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.048061 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-9w75g_c190e38d-4893-49c9-a633-e6b912030d37/nmstate-operator/0.log" Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.060029 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nwjd8"] Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.071626 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-14cb-account-create-update-xw2dd"] Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.086316 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nwjd8"] Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.251534 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-52vj8_6d6a6ca4-12c5-4bc1-b67e-5a48d1fe86f8/nmstate-webhook/0.log" Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.584821 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="623225aa-2492-494e-be5b-92acef6f23cf" path="/var/lib/kubelet/pods/623225aa-2492-494e-be5b-92acef6f23cf/volumes" Feb 17 14:37:22 crc kubenswrapper[4836]: I0217 14:37:22.585833 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ce1c7a-57e8-491e-84ab-8aed8baea37b" path="/var/lib/kubelet/pods/d4ce1c7a-57e8-491e-84ab-8aed8baea37b/volumes" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.054785 4836 scope.go:117] "RemoveContainer" containerID="b3fd8198bda32089f8d16c7005023bc9355442a69582a28217b4faa19a58edfd" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.093049 4836 scope.go:117] "RemoveContainer" containerID="bda2c6a640050c54150d82f44c6e78a2f7107b79ee0b4f6fd03e4d8c6e1019d3" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.150822 4836 scope.go:117] "RemoveContainer" containerID="8f88022ab4daa99006c48416f95fa6fcf0ec231af3f8553f0fffe8cc8f1971ee" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.197129 4836 scope.go:117] "RemoveContainer" containerID="55c6c8d1d911f68476c5d07d35dec7d57e500cdc1c29d64681255555160897dd" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.285733 4836 scope.go:117] "RemoveContainer" containerID="ca8e0602e1b36f3c2d9bfabc7020988df18e6945d19646bd583313467d47a539" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.321588 4836 scope.go:117] "RemoveContainer" containerID="c0e6439979838c98e66157164ef8073f70f7245c52bc8c72b4753a2777fab786" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.374434 4836 scope.go:117] "RemoveContainer" containerID="7e6f04d96e5a077df5020259f367870723b0f91e790c0b81e936bf2cbc3790f9" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.399478 4836 scope.go:117] "RemoveContainer" containerID="0179fb4c7564ecef52fa63a2f91fe687b3340cb3f7aaa46ff46f4ec68e5ee26d" Feb 17 14:37:24 crc kubenswrapper[4836]: I0217 14:37:24.431052 4836 scope.go:117] "RemoveContainer" containerID="bf410eadcd21b6c409b08a23916bc0ac4d5ba43505387a89c251ab098b87e562" Feb 17 14:37:29 
crc kubenswrapper[4836]: I0217 14:37:29.074806 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-2ea0-account-create-update-p7p99"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.085832 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-jjrp2"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.097011 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0d11-account-create-update-jf72z"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.107274 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-69hk6"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.117060 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-652c-account-create-update-lswdv"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.128578 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-w5qdk"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.141365 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-69hk6"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.150965 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-2ea0-account-create-update-p7p99"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.160246 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-652c-account-create-update-lswdv"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.171345 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-w5qdk"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.183245 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0d11-account-create-update-jf72z"] Feb 17 14:37:29 crc kubenswrapper[4836]: I0217 14:37:29.194874 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cloudkitty-db-create-jjrp2"] Feb 17 14:37:30 crc kubenswrapper[4836]: I0217 14:37:30.569503 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:37:30 crc kubenswrapper[4836]: E0217 14:37:30.570187 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:37:30 crc kubenswrapper[4836]: I0217 14:37:30.582267 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee1a0f2-86df-4f97-957a-22bbd7da4505" path="/var/lib/kubelet/pods/2ee1a0f2-86df-4f97-957a-22bbd7da4505/volumes" Feb 17 14:37:30 crc kubenswrapper[4836]: I0217 14:37:30.583394 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4edeb89f-0bd9-466e-a9f9-2d45575d2c72" path="/var/lib/kubelet/pods/4edeb89f-0bd9-466e-a9f9-2d45575d2c72/volumes" Feb 17 14:37:30 crc kubenswrapper[4836]: I0217 14:37:30.584174 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767841a7-db94-430a-b408-10e5bd0350e5" path="/var/lib/kubelet/pods/767841a7-db94-430a-b408-10e5bd0350e5/volumes" Feb 17 14:37:30 crc kubenswrapper[4836]: I0217 14:37:30.584864 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1fe36f3-d6b6-44e0-b85b-6def754fd08e" path="/var/lib/kubelet/pods/a1fe36f3-d6b6-44e0-b85b-6def754fd08e/volumes" Feb 17 14:37:30 crc kubenswrapper[4836]: I0217 14:37:30.586230 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb354e85-311d-40bb-ae4a-5c535d4d89b9" path="/var/lib/kubelet/pods/eb354e85-311d-40bb-ae4a-5c535d4d89b9/volumes" Feb 17 14:37:30 crc 
kubenswrapper[4836]: I0217 14:37:30.587169 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ee15e8-6695-454f-83ad-d54176458497" path="/var/lib/kubelet/pods/f9ee15e8-6695-454f-83ad-d54176458497/volumes" Feb 17 14:37:36 crc kubenswrapper[4836]: I0217 14:37:36.040072 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-q25rr"] Feb 17 14:37:36 crc kubenswrapper[4836]: I0217 14:37:36.054874 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-q25rr"] Feb 17 14:37:36 crc kubenswrapper[4836]: I0217 14:37:36.582199 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1d4ef8-03d9-42d8-ae0b-9410767ed25f" path="/var/lib/kubelet/pods/6a1d4ef8-03d9-42d8-ae0b-9410767ed25f/volumes" Feb 17 14:37:36 crc kubenswrapper[4836]: I0217 14:37:36.728583 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dfd4b8c4b-kclf7_297a6b35-d11d-4c2b-858c-79cb4c3c1b2c/kube-rbac-proxy/0.log" Feb 17 14:37:36 crc kubenswrapper[4836]: I0217 14:37:36.774243 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dfd4b8c4b-kclf7_297a6b35-d11d-4c2b-858c-79cb4c3c1b2c/manager/0.log" Feb 17 14:37:41 crc kubenswrapper[4836]: I0217 14:37:41.567949 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:37:41 crc kubenswrapper[4836]: E0217 14:37:41.568739 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 
14:37:51 crc kubenswrapper[4836]: I0217 14:37:51.472661 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-xm2rk_755bc851-3fff-45db-bbcf-164a27afcf85/prometheus-operator/0.log"
Feb 17 14:37:51 crc kubenswrapper[4836]: I0217 14:37:51.585498 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_5a9fdae1-f115-4e94-9b72-026862e02026/prometheus-operator-admission-webhook/0.log"
Feb 17 14:37:51 crc kubenswrapper[4836]: I0217 14:37:51.702173 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_ce0a3fd2-d84a-417c-bd46-c0dba979376e/prometheus-operator-admission-webhook/0.log"
Feb 17 14:37:51 crc kubenswrapper[4836]: I0217 14:37:51.834036 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-f94f2_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578/operator/0.log"
Feb 17 14:37:51 crc kubenswrapper[4836]: I0217 14:37:51.979915 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-vqhkf_c4b6d996-7a86-4512-825f-6e6d34148862/perses-operator/0.log"
Feb 17 14:37:53 crc kubenswrapper[4836]: I0217 14:37:53.568774 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:37:53 crc kubenswrapper[4836]: E0217 14:37:53.569389 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:37:55 crc kubenswrapper[4836]: I0217 14:37:55.039268 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-z8g7x"]
Feb 17 14:37:55 crc kubenswrapper[4836]: I0217 14:37:55.064048 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-z8g7x"]
Feb 17 14:37:56 crc kubenswrapper[4836]: I0217 14:37:56.580037 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df3a6cf1-bca0-45b2-9f7c-6d483452d49d" path="/var/lib/kubelet/pods/df3a6cf1-bca0-45b2-9f7c-6d483452d49d/volumes"
Feb 17 14:38:06 crc kubenswrapper[4836]: I0217 14:38:06.573702 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:38:06 crc kubenswrapper[4836]: E0217 14:38:06.574734 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:38:06 crc kubenswrapper[4836]: I0217 14:38:06.623269 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-szl4j_27eed55a-1a00-497e-9aa4-74f7007f336e/kube-rbac-proxy/0.log"
Feb 17 14:38:06 crc kubenswrapper[4836]: I0217 14:38:06.646415 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-szl4j_27eed55a-1a00-497e-9aa4-74f7007f336e/controller/0.log"
Feb 17 14:38:06 crc kubenswrapper[4836]: I0217 14:38:06.842726 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-mznjt_18ec2995-af0c-4c47-aa70-480f9323329e/frr-k8s-webhook-server/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.045005 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-frr-files/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.242572 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-reloader/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.247455 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-frr-files/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.261660 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-reloader/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.269693 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-metrics/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.479477 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-reloader/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.503916 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-metrics/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.556587 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-metrics/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.565544 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-frr-files/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.804133 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-reloader/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.829663 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-metrics/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.830954 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/cp-frr-files/0.log"
Feb 17 14:38:07 crc kubenswrapper[4836]: I0217 14:38:07.891367 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/controller/0.log"
Feb 17 14:38:08 crc kubenswrapper[4836]: I0217 14:38:08.052814 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/frr-metrics/0.log"
Feb 17 14:38:08 crc kubenswrapper[4836]: I0217 14:38:08.094220 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/kube-rbac-proxy/0.log"
Feb 17 14:38:08 crc kubenswrapper[4836]: I0217 14:38:08.161029 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/kube-rbac-proxy-frr/0.log"
Feb 17 14:38:08 crc kubenswrapper[4836]: I0217 14:38:08.306359 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/reloader/0.log"
Feb 17 14:38:08 crc kubenswrapper[4836]: I0217 14:38:08.447570 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-69b9cbf5df-6fkqt_ccb35f40-d0b8-4a1e-8c45-63dd6987b72c/manager/0.log"
Feb 17 14:38:08 crc kubenswrapper[4836]: I0217 14:38:08.641091 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-856546fc87-n5vrx_16c736d5-389e-4d03-9657-1abcd4448953/webhook-server/0.log"
Feb 17 14:38:08 crc kubenswrapper[4836]: I0217 14:38:08.889667 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pb5ff_2690ef6e-0489-43f3-b787-8b6c1295e283/kube-rbac-proxy/0.log"
Feb 17 14:38:09 crc kubenswrapper[4836]: I0217 14:38:09.142601 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x257b_e019f338-ff73-4160-a283-a71e9e6119b3/frr/0.log"
Feb 17 14:38:09 crc kubenswrapper[4836]: I0217 14:38:09.375632 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pb5ff_2690ef6e-0489-43f3-b787-8b6c1295e283/speaker/0.log"
Feb 17 14:38:13 crc kubenswrapper[4836]: I0217 14:38:13.059214 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-sb6h7"]
Feb 17 14:38:13 crc kubenswrapper[4836]: I0217 14:38:13.067864 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-sb6h7"]
Feb 17 14:38:14 crc kubenswrapper[4836]: I0217 14:38:14.612009 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ddbaec-f370-44a3-802b-26980ea65d2f" path="/var/lib/kubelet/pods/81ddbaec-f370-44a3-802b-26980ea65d2f/volumes"
Feb 17 14:38:18 crc kubenswrapper[4836]: I0217 14:38:18.568755 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:38:18 crc kubenswrapper[4836]: E0217 14:38:18.569720 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:38:22 crc kubenswrapper[4836]: I0217 14:38:22.626598 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf_3464477d-9902-4d40-9048-443132123fb3/util/0.log"
Feb 17 14:38:22 crc kubenswrapper[4836]: I0217 14:38:22.790709 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf_3464477d-9902-4d40-9048-443132123fb3/util/0.log"
Feb 17 14:38:22 crc kubenswrapper[4836]: I0217 14:38:22.798103 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf_3464477d-9902-4d40-9048-443132123fb3/pull/0.log"
Feb 17 14:38:22 crc kubenswrapper[4836]: I0217 14:38:22.852083 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf_3464477d-9902-4d40-9048-443132123fb3/pull/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.129341 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf_3464477d-9902-4d40-9048-443132123fb3/util/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.130421 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf_3464477d-9902-4d40-9048-443132123fb3/pull/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.147516 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651jtdwf_3464477d-9902-4d40-9048-443132123fb3/extract/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.321109 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd_f611c52f-90dc-454e-8c3c-ca9d6a915f58/util/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.521923 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd_f611c52f-90dc-454e-8c3c-ca9d6a915f58/util/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.528213 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd_f611c52f-90dc-454e-8c3c-ca9d6a915f58/pull/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.536626 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd_f611c52f-90dc-454e-8c3c-ca9d6a915f58/pull/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.730520 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd_f611c52f-90dc-454e-8c3c-ca9d6a915f58/util/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.759430 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd_f611c52f-90dc-454e-8c3c-ca9d6a915f58/pull/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.770893 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08j84kd_f611c52f-90dc-454e-8c3c-ca9d6a915f58/extract/0.log"
Feb 17 14:38:23 crc kubenswrapper[4836]: I0217 14:38:23.934089 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl_5939eb42-42be-4ecf-845a-c28b4669c02d/util/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.128411 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl_5939eb42-42be-4ecf-845a-c28b4669c02d/util/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.130487 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl_5939eb42-42be-4ecf-845a-c28b4669c02d/pull/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.171827 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl_5939eb42-42be-4ecf-845a-c28b4669c02d/pull/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.322550 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl_5939eb42-42be-4ecf-845a-c28b4669c02d/util/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.338151 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl_5939eb42-42be-4ecf-845a-c28b4669c02d/pull/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.341093 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213cwqrl_5939eb42-42be-4ecf-845a-c28b4669c02d/extract/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.564951 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxj4j_eaecd71b-3b00-427a-9654-9d04af5469b9/extract-utilities/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.646079 4836 scope.go:117] "RemoveContainer" containerID="35ecf820b0414db1c94b077c083568db5d4a957bb9d735db9d4e378b6ebbc861"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.708333 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxj4j_eaecd71b-3b00-427a-9654-9d04af5469b9/extract-utilities/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.732602 4836 scope.go:117] "RemoveContainer" containerID="3ae7c112e0518db5ada6508ad8c57217e914b3d3401ff927d4aa18b2e2dd9f79"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.772488 4836 scope.go:117] "RemoveContainer" containerID="0112cdba6fc4f4acf8102f48cb77deaeb49a0b5c8b49e3c6adcdb559d7e100b6"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.777641 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxj4j_eaecd71b-3b00-427a-9654-9d04af5469b9/extract-content/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.779480 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxj4j_eaecd71b-3b00-427a-9654-9d04af5469b9/extract-content/0.log"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.822065 4836 scope.go:117] "RemoveContainer" containerID="5e36e16a50074efc0038c12585afeefa45bc968423f053fecc01a7a460fc9fd3"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.895350 4836 scope.go:117] "RemoveContainer" containerID="e3b5cb6d26fdb2e586683ff31b8abe63df8d533a376c42dd280747ab5e165f5e"
Feb 17 14:38:24 crc kubenswrapper[4836]: I0217 14:38:24.943228 4836 scope.go:117] "RemoveContainer" containerID="515b55d1439f54ad3649999fcf112b0e86238d037ec2170a1978295a22c02429"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.030427 4836 scope.go:117] "RemoveContainer" containerID="7f08e0024064e8fd1c473afb57d745eb10366b72696b8824621db71657c54472"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.084708 4836 scope.go:117] "RemoveContainer" containerID="86d009aabc2aafe94768037f28b03b96d85141a639669b82cdbd2fa653d9696d"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.117411 4836 scope.go:117] "RemoveContainer" containerID="2953db160f228060c084b5fd479ec149c2b0acd6cacae4957fb68229d08ae1b9"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.134814 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxj4j_eaecd71b-3b00-427a-9654-9d04af5469b9/extract-content/0.log"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.210720 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxj4j_eaecd71b-3b00-427a-9654-9d04af5469b9/extract-utilities/0.log"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.403709 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zjxvt_212802dd-4c4f-444a-b443-bc3bbd1431bc/extract-utilities/0.log"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.527811 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xxj4j_eaecd71b-3b00-427a-9654-9d04af5469b9/registry-server/0.log"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.658895 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zjxvt_212802dd-4c4f-444a-b443-bc3bbd1431bc/extract-content/0.log"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.661672 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zjxvt_212802dd-4c4f-444a-b443-bc3bbd1431bc/extract-utilities/0.log"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.742549 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zjxvt_212802dd-4c4f-444a-b443-bc3bbd1431bc/extract-content/0.log"
Feb 17 14:38:25 crc kubenswrapper[4836]: I0217 14:38:25.985862 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zjxvt_212802dd-4c4f-444a-b443-bc3bbd1431bc/extract-utilities/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.143192 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zjxvt_212802dd-4c4f-444a-b443-bc3bbd1431bc/extract-content/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.385396 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz_96be2236-f07d-4944-8afa-b15a4ce0c4f0/util/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.675207 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-zjxvt_212802dd-4c4f-444a-b443-bc3bbd1431bc/registry-server/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.687258 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz_96be2236-f07d-4944-8afa-b15a4ce0c4f0/pull/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.719569 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz_96be2236-f07d-4944-8afa-b15a4ce0c4f0/util/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.721372 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz_96be2236-f07d-4944-8afa-b15a4ce0c4f0/pull/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.962673 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz_96be2236-f07d-4944-8afa-b15a4ce0c4f0/pull/0.log"
Feb 17 14:38:26 crc kubenswrapper[4836]: I0217 14:38:26.990903 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz_96be2236-f07d-4944-8afa-b15a4ce0c4f0/util/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.005112 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rhsgl_bd68f8c7-fdcc-449d-9f92-2f7afcb4917b/marketplace-operator/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.005345 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecawmbrz_96be2236-f07d-4944-8afa-b15a4ce0c4f0/extract/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.142968 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8gtc9_8fb3c078-0953-4561-a532-cc25ff32d845/extract-utilities/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.360970 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8gtc9_8fb3c078-0953-4561-a532-cc25ff32d845/extract-utilities/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.375238 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8gtc9_8fb3c078-0953-4561-a532-cc25ff32d845/extract-content/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.404419 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8gtc9_8fb3c078-0953-4561-a532-cc25ff32d845/extract-content/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.559217 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8gtc9_8fb3c078-0953-4561-a532-cc25ff32d845/extract-utilities/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.631498 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8gtc9_8fb3c078-0953-4561-a532-cc25ff32d845/extract-content/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.631597 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r5vl4_5d52263a-9417-43b6-903c-79e41b1200a0/extract-utilities/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.714076 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-8gtc9_8fb3c078-0953-4561-a532-cc25ff32d845/registry-server/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.864534 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r5vl4_5d52263a-9417-43b6-903c-79e41b1200a0/extract-utilities/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.886745 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r5vl4_5d52263a-9417-43b6-903c-79e41b1200a0/extract-content/0.log"
Feb 17 14:38:27 crc kubenswrapper[4836]: I0217 14:38:27.942506 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r5vl4_5d52263a-9417-43b6-903c-79e41b1200a0/extract-content/0.log"
Feb 17 14:38:28 crc kubenswrapper[4836]: I0217 14:38:28.090480 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r5vl4_5d52263a-9417-43b6-903c-79e41b1200a0/extract-content/0.log"
Feb 17 14:38:28 crc kubenswrapper[4836]: I0217 14:38:28.118415 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r5vl4_5d52263a-9417-43b6-903c-79e41b1200a0/extract-utilities/0.log"
Feb 17 14:38:28 crc kubenswrapper[4836]: I0217 14:38:28.275548 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r5vl4_5d52263a-9417-43b6-903c-79e41b1200a0/registry-server/0.log"
Feb 17 14:38:29 crc kubenswrapper[4836]: I0217 14:38:29.064375 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pdhxs"]
Feb 17 14:38:29 crc kubenswrapper[4836]: I0217 14:38:29.085928 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vmgps"]
Feb 17 14:38:29 crc kubenswrapper[4836]: I0217 14:38:29.101456 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pdhxs"]
Feb 17 14:38:29 crc kubenswrapper[4836]: I0217 14:38:29.116904 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vmgps"]
Feb 17 14:38:30 crc kubenswrapper[4836]: I0217 14:38:30.038585 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-g9l4s"]
Feb 17 14:38:30 crc kubenswrapper[4836]: I0217 14:38:30.053909 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-g9l4s"]
Feb 17 14:38:30 crc kubenswrapper[4836]: I0217 14:38:30.568558 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:38:30 crc kubenswrapper[4836]: E0217 14:38:30.569332 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:38:30 crc kubenswrapper[4836]: I0217 14:38:30.588189 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10331926-261d-4e44-a8c2-89846903ca12" path="/var/lib/kubelet/pods/10331926-261d-4e44-a8c2-89846903ca12/volumes"
Feb 17 14:38:30 crc kubenswrapper[4836]: I0217 14:38:30.589189 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18361bc2-5db1-4611-be18-38593e0b5d5d" path="/var/lib/kubelet/pods/18361bc2-5db1-4611-be18-38593e0b5d5d/volumes"
Feb 17 14:38:30 crc kubenswrapper[4836]: I0217 14:38:30.590264 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe4b42c-afbf-41e1-8035-5fffb156eadc" path="/var/lib/kubelet/pods/1fe4b42c-afbf-41e1-8035-5fffb156eadc/volumes"
Feb 17 14:38:40 crc kubenswrapper[4836]: I0217 14:38:40.039235 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-qqwhc"]
Feb 17 14:38:40 crc kubenswrapper[4836]: I0217 14:38:40.065892 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-qqwhc"]
Feb 17 14:38:40 crc kubenswrapper[4836]: I0217 14:38:40.579535 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8185c649-f1ad-4230-830d-07d002e5b358" path="/var/lib/kubelet/pods/8185c649-f1ad-4230-830d-07d002e5b358/volumes"
Feb 17 14:38:41 crc kubenswrapper[4836]: I0217 14:38:41.490514 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b474d8486-lngwm_5a9fdae1-f115-4e94-9b72-026862e02026/prometheus-operator-admission-webhook/0.log"
Feb 17 14:38:41 crc kubenswrapper[4836]: I0217 14:38:41.510149 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-xm2rk_755bc851-3fff-45db-bbcf-164a27afcf85/prometheus-operator/0.log"
Feb 17 14:38:41 crc kubenswrapper[4836]: I0217 14:38:41.544202 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5b474d8486-m8jhr_ce0a3fd2-d84a-417c-bd46-c0dba979376e/prometheus-operator-admission-webhook/0.log"
Feb 17 14:38:41 crc kubenswrapper[4836]: I0217 14:38:41.704329 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-vqhkf_c4b6d996-7a86-4512-825f-6e6d34148862/perses-operator/0.log"
Feb 17 14:38:41 crc kubenswrapper[4836]: I0217 14:38:41.723895 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-f94f2_d6acbcf2-dfc0-4a7b-b6bd-4b62c0b03578/operator/0.log"
Feb 17 14:38:42 crc kubenswrapper[4836]: I0217 14:38:42.568824 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:38:42 crc kubenswrapper[4836]: E0217 14:38:42.569535 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:38:51 crc kubenswrapper[4836]: I0217 14:38:51.041517 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-pvljf"]
Feb 17 14:38:51 crc kubenswrapper[4836]: I0217 14:38:51.058370 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-pvljf"]
Feb 17 14:38:52 crc kubenswrapper[4836]: I0217 14:38:52.581816 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e016162-2025-44ad-989d-ce71d9f8f9bf" path="/var/lib/kubelet/pods/4e016162-2025-44ad-989d-ce71d9f8f9bf/volumes"
Feb 17 14:38:55 crc kubenswrapper[4836]: I0217 14:38:55.433818 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dfd4b8c4b-kclf7_297a6b35-d11d-4c2b-858c-79cb4c3c1b2c/kube-rbac-proxy/0.log"
Feb 17 14:38:55 crc kubenswrapper[4836]: I0217 14:38:55.481822 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-dfd4b8c4b-kclf7_297a6b35-d11d-4c2b-858c-79cb4c3c1b2c/manager/0.log"
Feb 17 14:38:57 crc kubenswrapper[4836]: I0217 14:38:57.568403 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:38:57 crc kubenswrapper[4836]: E0217 14:38:57.569320 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"
Feb 17 14:39:08 crc kubenswrapper[4836]: I0217 14:39:08.088817 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-9z4jp"]
Feb 17 14:39:08 crc kubenswrapper[4836]: I0217 14:39:08.101202 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-9z4jp"]
Feb 17 14:39:08 crc kubenswrapper[4836]: I0217 14:39:08.580035 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f38b5f94-bc8b-4e64-abe6-8c39b920cb4b" path="/var/lib/kubelet/pods/f38b5f94-bc8b-4e64-abe6-8c39b920cb4b/volumes"
Feb 17 14:39:09 crc kubenswrapper[4836]: I0217 14:39:09.569228 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20"
Feb 17 14:39:10 crc kubenswrapper[4836]: I0217 14:39:10.747642 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"6f19bb9a4d6443b07f247471c35e97a577b83e39d81d033aff596fac57089969"}
Feb 17 14:39:25 crc kubenswrapper[4836]: I0217 14:39:25.385998 4836 scope.go:117] "RemoveContainer" containerID="ff24c89536ae06cf6a0fbffcb68050de3e8ed22356c912b4e7e87afbef99480d"
Feb 17 14:39:25 crc kubenswrapper[4836]: I0217 14:39:25.439165 4836 scope.go:117] "RemoveContainer" containerID="852265bc6ffb6ef9657692f454a84caf832b683e76f800e8dccb3317d95a69ea"
Feb 17 14:39:25 crc kubenswrapper[4836]: I0217 14:39:25.490683 4836 scope.go:117] "RemoveContainer" containerID="fc7f81c47e20cce7a74c227545b963bd61d6dadbccf7dacfaa97a9b912354775"
Feb 17 14:39:25 crc kubenswrapper[4836]: I0217 14:39:25.531115 4836 scope.go:117] "RemoveContainer" containerID="705f230fd2d44c1059294c17cc5410cef58dcabc1573c4e7f4f531d00aad46ec"
Feb 17 14:39:25 crc kubenswrapper[4836]: I0217 14:39:25.598606 4836 scope.go:117] "RemoveContainer" containerID="0a4b8ba8b2087b1a38486d6f6172aee2da2f8fb8e22feee2e93bb22306b6558e"
Feb 17 14:39:25 crc kubenswrapper[4836]: I0217 14:39:25.661804 4836 scope.go:117] "RemoveContainer" containerID="13ef4f24a42269dbbf22aa927159da757007caa607e5236e1441cff6b685fe12"
Feb 17 14:39:45 crc kubenswrapper[4836]: I0217 14:39:45.055901 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-q8wrd"]
Feb 17 14:39:45 crc kubenswrapper[4836]: I0217 14:39:45.072463 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-q8wrd"]
Feb 17 14:39:46 crc kubenswrapper[4836]: I0217 14:39:46.039963 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-npl52"]
Feb 17 14:39:46 crc kubenswrapper[4836]: I0217 14:39:46.052455 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-npl52"]
Feb 17 14:39:46 crc kubenswrapper[4836]: I0217 14:39:46.580724 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b1aa3a-dc15-4ec1-ba76-8246e300422f" path="/var/lib/kubelet/pods/88b1aa3a-dc15-4ec1-ba76-8246e300422f/volumes"
Feb 17 14:39:46 crc kubenswrapper[4836]: I0217 14:39:46.581530 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db342a3d-55f5-4b0c-b96f-327014b6fb82" path="/var/lib/kubelet/pods/db342a3d-55f5-4b0c-b96f-327014b6fb82/volumes"
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.036703 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8fba-account-create-update-gqd5n"]
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.047711 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-28f5-account-create-update-74tvm"]
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.061359 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5h5m9"]
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.076527 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a7c4-account-create-update-qj5lb"]
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.085186 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-28f5-account-create-update-74tvm"]
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.098096 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a7c4-account-create-update-qj5lb"]
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.109001 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5h5m9"]
Feb 17 14:39:47 crc kubenswrapper[4836]: I0217 14:39:47.119437 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8fba-account-create-update-gqd5n"]
Feb 17 14:39:48 crc kubenswrapper[4836]: I0217 14:39:48.580853 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0312359b-98a6-49c7-83f1-fb44c679e8aa" path="/var/lib/kubelet/pods/0312359b-98a6-49c7-83f1-fb44c679e8aa/volumes"
Feb 17 14:39:48 crc kubenswrapper[4836]: I0217 14:39:48.582007 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b8171da-ad25-4388-9dab-2afc19993d97" path="/var/lib/kubelet/pods/0b8171da-ad25-4388-9dab-2afc19993d97/volumes"
Feb 17 14:39:48 crc kubenswrapper[4836]: I0217 14:39:48.582955 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc00367-2940-413d-872a-74d4fa37fc1f" path="/var/lib/kubelet/pods/4dc00367-2940-413d-872a-74d4fa37fc1f/volumes"
Feb 17 14:39:48 crc kubenswrapper[4836]: I0217 14:39:48.583730 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d61f8c-4804-49b6-937e-fbaf20aa3ed2" path="/var/lib/kubelet/pods/c7d61f8c-4804-49b6-937e-fbaf20aa3ed2/volumes"
Feb 17 14:40:25 crc kubenswrapper[4836]: I0217 14:40:25.909826 4836 scope.go:117] "RemoveContainer" containerID="0b1fdb782cc59c87b5c334a8e29bc01c7def7137ff5e1a24115754176ed4d2ab"
Feb 17 14:40:25 crc kubenswrapper[4836]: I0217 14:40:25.940454 4836 scope.go:117] "RemoveContainer" containerID="e2428efba069899bf573bcb1f933d6f640083a8f0e4830cd36751b8b3332488d"
Feb 17 14:40:26 crc kubenswrapper[4836]: I0217 14:40:26.006817 4836 scope.go:117] "RemoveContainer" containerID="66b9158b23020b3eaa0a3cea1af11df9fcdac6316e74751284cbec084e23c3a0"
Feb 17 14:40:26 crc kubenswrapper[4836]: I0217 14:40:26.059797 4836 scope.go:117] "RemoveContainer" containerID="7dba2d07908548962f40435efa50aed2a21f68c9f55a50ad39cc396d718c6cf2"
Feb 17 14:40:26 crc kubenswrapper[4836]: I0217 14:40:26.114284 4836 scope.go:117] "RemoveContainer" containerID="b40337010298624b5f124e89e37fbded22f8ac5a672bad50ecf9c49dfa1ed535"
Feb 17 14:40:26 crc kubenswrapper[4836]: I0217 14:40:26.168145 4836 scope.go:117] "RemoveContainer" containerID="940b27e8f09ea23f3f385f55c83e9233f241038d9dc1c8761036c1c3dbf2e000"
Feb 17 14:40:26 crc kubenswrapper[4836]: I0217 14:40:26.210973 4836 scope.go:117] "RemoveContainer" containerID="b099deccdc43aaaf5e1d9673615b93cdbff588beb42726f387dc2c0ef267fb73"
Feb 17 14:40:26 crc kubenswrapper[4836]: I0217 14:40:26.241692 4836 scope.go:117] "RemoveContainer" containerID="a870dbadddedc2cd296e8c04a81b16817f6df39787b8061ee58f3dfc1fec3ca8"
Feb 17 14:40:35 crc kubenswrapper[4836]: I0217 14:40:35.051365 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-896gw"]
Feb 17 14:40:35 crc kubenswrapper[4836]: I0217 14:40:35.067074 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-896gw"]
Feb 17 14:40:35 crc kubenswrapper[4836]: I0217 14:40:35.822985 4836 generic.go:334] "Generic (PLEG): container finished" podID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerID="f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d" exitCode=0
Feb 17 14:40:35 crc kubenswrapper[4836]: I0217 14:40:35.823047 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snsbl/must-gather-4sqf7" event={"ID":"781729f0-fe27-45e7-bd7b-23709696ec4d","Type":"ContainerDied","Data":"f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d"}
Feb 17 14:40:35 crc kubenswrapper[4836]: I0217 14:40:35.823950 4836 scope.go:117] "RemoveContainer" containerID="f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d"
Feb 17 14:40:36 crc kubenswrapper[4836]: I0217 14:40:36.495618 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snsbl_must-gather-4sqf7_781729f0-fe27-45e7-bd7b-23709696ec4d/gather/0.log"
Feb 17 14:40:36 crc kubenswrapper[4836]: I0217 14:40:36.584380 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5284ac65-3629-4b0f-94ce-114964fe6d15" path="/var/lib/kubelet/pods/5284ac65-3629-4b0f-94ce-114964fe6d15/volumes"
Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.243225 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-snsbl/must-gather-4sqf7"]
Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.245452 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-snsbl/must-gather-4sqf7" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerName="copy" containerID="cri-o://07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67" gracePeriod=2
Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.258108 4836
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-snsbl/must-gather-4sqf7"] Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.808316 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snsbl_must-gather-4sqf7_781729f0-fe27-45e7-bd7b-23709696ec4d/copy/0.log" Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.809379 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snsbl/must-gather-4sqf7" Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.942578 4836 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snsbl_must-gather-4sqf7_781729f0-fe27-45e7-bd7b-23709696ec4d/copy/0.log" Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.943638 4836 generic.go:334] "Generic (PLEG): container finished" podID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerID="07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67" exitCode=143 Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.943711 4836 scope.go:117] "RemoveContainer" containerID="07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67" Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.943957 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snsbl/must-gather-4sqf7" Feb 17 14:40:45 crc kubenswrapper[4836]: I0217 14:40:45.991124 4836 scope.go:117] "RemoveContainer" containerID="f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.004434 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/781729f0-fe27-45e7-bd7b-23709696ec4d-must-gather-output\") pod \"781729f0-fe27-45e7-bd7b-23709696ec4d\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.004769 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6j6w\" (UniqueName: \"kubernetes.io/projected/781729f0-fe27-45e7-bd7b-23709696ec4d-kube-api-access-t6j6w\") pod \"781729f0-fe27-45e7-bd7b-23709696ec4d\" (UID: \"781729f0-fe27-45e7-bd7b-23709696ec4d\") " Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.015688 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781729f0-fe27-45e7-bd7b-23709696ec4d-kube-api-access-t6j6w" (OuterVolumeSpecName: "kube-api-access-t6j6w") pod "781729f0-fe27-45e7-bd7b-23709696ec4d" (UID: "781729f0-fe27-45e7-bd7b-23709696ec4d"). InnerVolumeSpecName "kube-api-access-t6j6w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.108123 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6j6w\" (UniqueName: \"kubernetes.io/projected/781729f0-fe27-45e7-bd7b-23709696ec4d-kube-api-access-t6j6w\") on node \"crc\" DevicePath \"\"" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.202278 4836 scope.go:117] "RemoveContainer" containerID="07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67" Feb 17 14:40:46 crc kubenswrapper[4836]: E0217 14:40:46.203760 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67\": container with ID starting with 07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67 not found: ID does not exist" containerID="07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.203812 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67"} err="failed to get container status \"07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67\": rpc error: code = NotFound desc = could not find container \"07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67\": container with ID starting with 07b49b7ed15d428bdf610ee0e143b812aa84072521d68f153a7b1e680fa48f67 not found: ID does not exist" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.203845 4836 scope.go:117] "RemoveContainer" containerID="f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d" Feb 17 14:40:46 crc kubenswrapper[4836]: E0217 14:40:46.210048 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d\": container with ID starting with f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d not found: ID does not exist" containerID="f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.211467 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d"} err="failed to get container status \"f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d\": rpc error: code = NotFound desc = could not find container \"f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d\": container with ID starting with f31755722b53fa2304e51f9abd859d08ebfbf46cba2b7c60b902e42814f8b35d not found: ID does not exist" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.335087 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/781729f0-fe27-45e7-bd7b-23709696ec4d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "781729f0-fe27-45e7-bd7b-23709696ec4d" (UID: "781729f0-fe27-45e7-bd7b-23709696ec4d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.415077 4836 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/781729f0-fe27-45e7-bd7b-23709696ec4d-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 14:40:46 crc kubenswrapper[4836]: I0217 14:40:46.589147 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" path="/var/lib/kubelet/pods/781729f0-fe27-45e7-bd7b-23709696ec4d/volumes" Feb 17 14:41:15 crc kubenswrapper[4836]: I0217 14:41:15.048348 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bz94v"] Feb 17 14:41:15 crc kubenswrapper[4836]: I0217 14:41:15.057972 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bz94v"] Feb 17 14:41:16 crc kubenswrapper[4836]: I0217 14:41:16.036148 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-lqvvn"] Feb 17 14:41:16 crc kubenswrapper[4836]: I0217 14:41:16.044950 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-lqvvn"] Feb 17 14:41:16 crc kubenswrapper[4836]: I0217 14:41:16.581672 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9d6a93-3d3a-4c5c-85cf-329209cfe911" path="/var/lib/kubelet/pods/3f9d6a93-3d3a-4c5c-85cf-329209cfe911/volumes" Feb 17 14:41:16 crc kubenswrapper[4836]: I0217 14:41:16.582428 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790a788c-3cfe-49c8-b1ff-a83bcedf17e0" path="/var/lib/kubelet/pods/790a788c-3cfe-49c8-b1ff-a83bcedf17e0/volumes" Feb 17 14:41:26 crc kubenswrapper[4836]: I0217 14:41:26.436819 4836 scope.go:117] "RemoveContainer" containerID="959d5cc1d8ba4d131ae83ee3b420db014e052fb98b3a6fa5c53753ae63d88003" Feb 17 14:41:26 crc kubenswrapper[4836]: I0217 14:41:26.488433 4836 
scope.go:117] "RemoveContainer" containerID="9a55578dc34e67ce0a93dbbd5c5e496ed951f38d462ffb4dcccf5ec23897e1c5" Feb 17 14:41:26 crc kubenswrapper[4836]: I0217 14:41:26.564305 4836 scope.go:117] "RemoveContainer" containerID="c224cbe49994301a8cf7d7e85623916f9815d0873ee461d723b64e1a3b753f8d" Feb 17 14:41:29 crc kubenswrapper[4836]: I0217 14:41:29.765324 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:41:29 crc kubenswrapper[4836]: I0217 14:41:29.765814 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:41:59 crc kubenswrapper[4836]: I0217 14:41:59.765240 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:41:59 crc kubenswrapper[4836]: I0217 14:41:59.766013 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:42:01 crc kubenswrapper[4836]: I0217 14:42:01.075610 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-h4mlr"] Feb 17 14:42:01 crc kubenswrapper[4836]: I0217 
14:42:01.086081 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-h4mlr"] Feb 17 14:42:02 crc kubenswrapper[4836]: I0217 14:42:02.587799 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="079f20c9-f742-4c4b-a8c0-a2a09573bf62" path="/var/lib/kubelet/pods/079f20c9-f742-4c4b-a8c0-a2a09573bf62/volumes" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.352894 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7rqln"] Feb 17 14:42:04 crc kubenswrapper[4836]: E0217 14:42:04.354306 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerName="copy" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.354330 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerName="copy" Feb 17 14:42:04 crc kubenswrapper[4836]: E0217 14:42:04.354377 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerName="gather" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.354385 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerName="gather" Feb 17 14:42:04 crc kubenswrapper[4836]: E0217 14:42:04.354518 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf" containerName="container-00" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.354535 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf" containerName="container-00" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.354769 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerName="gather" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.354788 4836 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3d11c6f6-7a22-40b4-b7ba-e0cb46e0cecf" containerName="container-00" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.354818 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="781729f0-fe27-45e7-bd7b-23709696ec4d" containerName="copy" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.360100 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.386747 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rqln"] Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.470371 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-catalog-content\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.470734 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-utilities\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.471001 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxnbz\" (UniqueName: \"kubernetes.io/projected/5bcd4960-7859-4e31-829d-e737ae014f31-kube-api-access-qxnbz\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.573513 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qxnbz\" (UniqueName: \"kubernetes.io/projected/5bcd4960-7859-4e31-829d-e737ae014f31-kube-api-access-qxnbz\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.573636 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-catalog-content\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.573729 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-utilities\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.574420 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-utilities\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.574450 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-catalog-content\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.600886 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxnbz\" (UniqueName: 
\"kubernetes.io/projected/5bcd4960-7859-4e31-829d-e737ae014f31-kube-api-access-qxnbz\") pod \"redhat-operators-7rqln\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:04 crc kubenswrapper[4836]: I0217 14:42:04.701477 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:05 crc kubenswrapper[4836]: I0217 14:42:05.278378 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7rqln"] Feb 17 14:42:06 crc kubenswrapper[4836]: I0217 14:42:06.216275 4836 generic.go:334] "Generic (PLEG): container finished" podID="5bcd4960-7859-4e31-829d-e737ae014f31" containerID="065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f" exitCode=0 Feb 17 14:42:06 crc kubenswrapper[4836]: I0217 14:42:06.216352 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rqln" event={"ID":"5bcd4960-7859-4e31-829d-e737ae014f31","Type":"ContainerDied","Data":"065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f"} Feb 17 14:42:06 crc kubenswrapper[4836]: I0217 14:42:06.216398 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rqln" event={"ID":"5bcd4960-7859-4e31-829d-e737ae014f31","Type":"ContainerStarted","Data":"3b2d2320c7cbb136eafb357b4ff7cfbbe5c583adde16eb7ed6a081a0f7bec0b0"} Feb 17 14:42:06 crc kubenswrapper[4836]: I0217 14:42:06.219565 4836 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 14:42:07 crc kubenswrapper[4836]: I0217 14:42:07.248278 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rqln" event={"ID":"5bcd4960-7859-4e31-829d-e737ae014f31","Type":"ContainerStarted","Data":"ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba"} Feb 17 14:42:09 crc 
kubenswrapper[4836]: I0217 14:42:09.537043 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s96zg"] Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.541134 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.553412 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s96zg"] Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.642852 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769bf\" (UniqueName: \"kubernetes.io/projected/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-kube-api-access-769bf\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.644129 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-utilities\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.644322 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-catalog-content\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.746404 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-utilities\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.746521 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-catalog-content\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.746694 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-769bf\" (UniqueName: \"kubernetes.io/projected/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-kube-api-access-769bf\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.746963 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-utilities\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.747085 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-catalog-content\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.780437 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-769bf\" (UniqueName: 
\"kubernetes.io/projected/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-kube-api-access-769bf\") pod \"certified-operators-s96zg\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:09 crc kubenswrapper[4836]: I0217 14:42:09.863189 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:10 crc kubenswrapper[4836]: W0217 14:42:10.445205 4836 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782abdb8_014c_4d56_a7c7_a5ffb8a8e609.slice/crio-52d2f884fbcb1f46af948eaf3e822be773031f979f684862e6d76af9c0054264 WatchSource:0}: Error finding container 52d2f884fbcb1f46af948eaf3e822be773031f979f684862e6d76af9c0054264: Status 404 returned error can't find the container with id 52d2f884fbcb1f46af948eaf3e822be773031f979f684862e6d76af9c0054264 Feb 17 14:42:10 crc kubenswrapper[4836]: I0217 14:42:10.446176 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s96zg"] Feb 17 14:42:11 crc kubenswrapper[4836]: I0217 14:42:11.303691 4836 generic.go:334] "Generic (PLEG): container finished" podID="5bcd4960-7859-4e31-829d-e737ae014f31" containerID="ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba" exitCode=0 Feb 17 14:42:11 crc kubenswrapper[4836]: I0217 14:42:11.303794 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rqln" event={"ID":"5bcd4960-7859-4e31-829d-e737ae014f31","Type":"ContainerDied","Data":"ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba"} Feb 17 14:42:11 crc kubenswrapper[4836]: I0217 14:42:11.308476 4836 generic.go:334] "Generic (PLEG): container finished" podID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerID="383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76" exitCode=0 Feb 17 14:42:11 crc 
kubenswrapper[4836]: I0217 14:42:11.308546 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96zg" event={"ID":"782abdb8-014c-4d56-a7c7-a5ffb8a8e609","Type":"ContainerDied","Data":"383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76"} Feb 17 14:42:11 crc kubenswrapper[4836]: I0217 14:42:11.308603 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96zg" event={"ID":"782abdb8-014c-4d56-a7c7-a5ffb8a8e609","Type":"ContainerStarted","Data":"52d2f884fbcb1f46af948eaf3e822be773031f979f684862e6d76af9c0054264"} Feb 17 14:42:12 crc kubenswrapper[4836]: I0217 14:42:12.321179 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rqln" event={"ID":"5bcd4960-7859-4e31-829d-e737ae014f31","Type":"ContainerStarted","Data":"7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f"} Feb 17 14:42:12 crc kubenswrapper[4836]: I0217 14:42:12.325194 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96zg" event={"ID":"782abdb8-014c-4d56-a7c7-a5ffb8a8e609","Type":"ContainerStarted","Data":"6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87"} Feb 17 14:42:12 crc kubenswrapper[4836]: I0217 14:42:12.349583 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7rqln" podStartSLOduration=2.8499781029999998 podStartE2EDuration="8.349536169s" podCreationTimestamp="2026-02-17 14:42:04 +0000 UTC" firstStartedPulling="2026-02-17 14:42:06.219172923 +0000 UTC m=+2152.562101192" lastFinishedPulling="2026-02-17 14:42:11.718730969 +0000 UTC m=+2158.061659258" observedRunningTime="2026-02-17 14:42:12.341547613 +0000 UTC m=+2158.684475902" watchObservedRunningTime="2026-02-17 14:42:12.349536169 +0000 UTC m=+2158.692464438" Feb 17 14:42:14 crc kubenswrapper[4836]: I0217 14:42:14.701684 4836 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:14 crc kubenswrapper[4836]: I0217 14:42:14.702132 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:15 crc kubenswrapper[4836]: I0217 14:42:15.355252 4836 generic.go:334] "Generic (PLEG): container finished" podID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerID="6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87" exitCode=0 Feb 17 14:42:15 crc kubenswrapper[4836]: I0217 14:42:15.355292 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96zg" event={"ID":"782abdb8-014c-4d56-a7c7-a5ffb8a8e609","Type":"ContainerDied","Data":"6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87"} Feb 17 14:42:15 crc kubenswrapper[4836]: I0217 14:42:15.910792 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7rqln" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="registry-server" probeResult="failure" output=< Feb 17 14:42:15 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:42:15 crc kubenswrapper[4836]: > Feb 17 14:42:16 crc kubenswrapper[4836]: I0217 14:42:16.368773 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96zg" event={"ID":"782abdb8-014c-4d56-a7c7-a5ffb8a8e609","Type":"ContainerStarted","Data":"4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52"} Feb 17 14:42:16 crc kubenswrapper[4836]: I0217 14:42:16.392899 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s96zg" podStartSLOduration=2.592563839 podStartE2EDuration="7.392868348s" podCreationTimestamp="2026-02-17 14:42:09 +0000 UTC" firstStartedPulling="2026-02-17 14:42:11.311179991 +0000 UTC m=+2157.654108260" 
lastFinishedPulling="2026-02-17 14:42:16.1114845 +0000 UTC m=+2162.454412769" observedRunningTime="2026-02-17 14:42:16.387548444 +0000 UTC m=+2162.730476723" watchObservedRunningTime="2026-02-17 14:42:16.392868348 +0000 UTC m=+2162.735796617" Feb 17 14:42:19 crc kubenswrapper[4836]: I0217 14:42:19.863912 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:19 crc kubenswrapper[4836]: I0217 14:42:19.864270 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:20 crc kubenswrapper[4836]: I0217 14:42:20.914574 4836 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-s96zg" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="registry-server" probeResult="failure" output=< Feb 17 14:42:20 crc kubenswrapper[4836]: timeout: failed to connect service ":50051" within 1s Feb 17 14:42:20 crc kubenswrapper[4836]: > Feb 17 14:42:24 crc kubenswrapper[4836]: I0217 14:42:24.759625 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:24 crc kubenswrapper[4836]: I0217 14:42:24.821186 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:25 crc kubenswrapper[4836]: I0217 14:42:25.002539 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rqln"] Feb 17 14:42:26 crc kubenswrapper[4836]: I0217 14:42:26.469415 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7rqln" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="registry-server" containerID="cri-o://7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f" gracePeriod=2 Feb 17 14:42:26 crc kubenswrapper[4836]: 
I0217 14:42:26.783132 4836 scope.go:117] "RemoveContainer" containerID="6d24e9f78b938b24616765924395f09dc01b17f432bd2a5ca96dd30f763b95e2" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.154798 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.212800 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxnbz\" (UniqueName: \"kubernetes.io/projected/5bcd4960-7859-4e31-829d-e737ae014f31-kube-api-access-qxnbz\") pod \"5bcd4960-7859-4e31-829d-e737ae014f31\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.213175 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-catalog-content\") pod \"5bcd4960-7859-4e31-829d-e737ae014f31\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.213238 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-utilities\") pod \"5bcd4960-7859-4e31-829d-e737ae014f31\" (UID: \"5bcd4960-7859-4e31-829d-e737ae014f31\") " Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.214420 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-utilities" (OuterVolumeSpecName: "utilities") pod "5bcd4960-7859-4e31-829d-e737ae014f31" (UID: "5bcd4960-7859-4e31-829d-e737ae014f31"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.221132 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcd4960-7859-4e31-829d-e737ae014f31-kube-api-access-qxnbz" (OuterVolumeSpecName: "kube-api-access-qxnbz") pod "5bcd4960-7859-4e31-829d-e737ae014f31" (UID: "5bcd4960-7859-4e31-829d-e737ae014f31"). InnerVolumeSpecName "kube-api-access-qxnbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.317265 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.317348 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxnbz\" (UniqueName: \"kubernetes.io/projected/5bcd4960-7859-4e31-829d-e737ae014f31-kube-api-access-qxnbz\") on node \"crc\" DevicePath \"\"" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.370648 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bcd4960-7859-4e31-829d-e737ae014f31" (UID: "5bcd4960-7859-4e31-829d-e737ae014f31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.419947 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcd4960-7859-4e31-829d-e737ae014f31-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.482142 4836 generic.go:334] "Generic (PLEG): container finished" podID="5bcd4960-7859-4e31-829d-e737ae014f31" containerID="7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f" exitCode=0 Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.482205 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rqln" event={"ID":"5bcd4960-7859-4e31-829d-e737ae014f31","Type":"ContainerDied","Data":"7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f"} Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.482243 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7rqln" event={"ID":"5bcd4960-7859-4e31-829d-e737ae014f31","Type":"ContainerDied","Data":"3b2d2320c7cbb136eafb357b4ff7cfbbe5c583adde16eb7ed6a081a0f7bec0b0"} Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.482265 4836 scope.go:117] "RemoveContainer" containerID="7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.482457 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7rqln" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.527797 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7rqln"] Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.529239 4836 scope.go:117] "RemoveContainer" containerID="ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.538783 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7rqln"] Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.568693 4836 scope.go:117] "RemoveContainer" containerID="065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.599590 4836 scope.go:117] "RemoveContainer" containerID="7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f" Feb 17 14:42:27 crc kubenswrapper[4836]: E0217 14:42:27.599861 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f\": container with ID starting with 7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f not found: ID does not exist" containerID="7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.599900 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f"} err="failed to get container status \"7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f\": rpc error: code = NotFound desc = could not find container \"7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f\": container with ID starting with 7717a65aa7f0f8e6a643858620b6dd5eb2c1e34f51d5b5756f90c00fc814cd9f not found: ID does 
not exist" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.599926 4836 scope.go:117] "RemoveContainer" containerID="ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba" Feb 17 14:42:27 crc kubenswrapper[4836]: E0217 14:42:27.600288 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba\": container with ID starting with ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba not found: ID does not exist" containerID="ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.600325 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba"} err="failed to get container status \"ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba\": rpc error: code = NotFound desc = could not find container \"ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba\": container with ID starting with ecec317ab07846296a1224db623c16f7c4e0c50365e2c51ebe73d09f24a315ba not found: ID does not exist" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.600338 4836 scope.go:117] "RemoveContainer" containerID="065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f" Feb 17 14:42:27 crc kubenswrapper[4836]: E0217 14:42:27.600805 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f\": container with ID starting with 065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f not found: ID does not exist" containerID="065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f" Feb 17 14:42:27 crc kubenswrapper[4836]: I0217 14:42:27.600827 4836 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f"} err="failed to get container status \"065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f\": rpc error: code = NotFound desc = could not find container \"065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f\": container with ID starting with 065f0b85b5c506e038b1ad3e7531809387c1f19b7e964615c77473205ec24a4f not found: ID does not exist" Feb 17 14:42:28 crc kubenswrapper[4836]: I0217 14:42:28.610733 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" path="/var/lib/kubelet/pods/5bcd4960-7859-4e31-829d-e737ae014f31/volumes" Feb 17 14:42:29 crc kubenswrapper[4836]: I0217 14:42:29.765246 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:42:29 crc kubenswrapper[4836]: I0217 14:42:29.765427 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:42:29 crc kubenswrapper[4836]: I0217 14:42:29.765509 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:42:29 crc kubenswrapper[4836]: I0217 14:42:29.766593 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6f19bb9a4d6443b07f247471c35e97a577b83e39d81d033aff596fac57089969"} 
pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:42:29 crc kubenswrapper[4836]: I0217 14:42:29.766703 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://6f19bb9a4d6443b07f247471c35e97a577b83e39d81d033aff596fac57089969" gracePeriod=600 Feb 17 14:42:30 crc kubenswrapper[4836]: I0217 14:42:30.041507 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:30 crc kubenswrapper[4836]: I0217 14:42:30.159837 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:30 crc kubenswrapper[4836]: I0217 14:42:30.410896 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s96zg"] Feb 17 14:42:30 crc kubenswrapper[4836]: I0217 14:42:30.517031 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="6f19bb9a4d6443b07f247471c35e97a577b83e39d81d033aff596fac57089969" exitCode=0 Feb 17 14:42:30 crc kubenswrapper[4836]: I0217 14:42:30.517481 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"6f19bb9a4d6443b07f247471c35e97a577b83e39d81d033aff596fac57089969"} Feb 17 14:42:30 crc kubenswrapper[4836]: I0217 14:42:30.517676 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" 
event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerStarted","Data":"0cd1f125f5a9b4dbb6838481ae07ba424a29fd62723f85d44334ea5eb698c97d"} Feb 17 14:42:30 crc kubenswrapper[4836]: I0217 14:42:30.517730 4836 scope.go:117] "RemoveContainer" containerID="325cd676e21fc52f03c197c519aae517b900944cff0ba106872a3e674c1b1f20" Feb 17 14:42:31 crc kubenswrapper[4836]: I0217 14:42:31.529706 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s96zg" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="registry-server" containerID="cri-o://4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52" gracePeriod=2 Feb 17 14:42:31 crc kubenswrapper[4836]: E0217 14:42:31.795079 4836 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782abdb8_014c_4d56_a7c7_a5ffb8a8e609.slice/crio-conmon-4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782abdb8_014c_4d56_a7c7_a5ffb8a8e609.slice/crio-4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52.scope\": RecentStats: unable to find data in memory cache]" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.095568 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.144893 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-catalog-content\") pod \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.145632 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-utilities\") pod \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.145948 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-769bf\" (UniqueName: \"kubernetes.io/projected/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-kube-api-access-769bf\") pod \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\" (UID: \"782abdb8-014c-4d56-a7c7-a5ffb8a8e609\") " Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.146604 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-utilities" (OuterVolumeSpecName: "utilities") pod "782abdb8-014c-4d56-a7c7-a5ffb8a8e609" (UID: "782abdb8-014c-4d56-a7c7-a5ffb8a8e609"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.147453 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.162699 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-kube-api-access-769bf" (OuterVolumeSpecName: "kube-api-access-769bf") pod "782abdb8-014c-4d56-a7c7-a5ffb8a8e609" (UID: "782abdb8-014c-4d56-a7c7-a5ffb8a8e609"). InnerVolumeSpecName "kube-api-access-769bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.209043 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "782abdb8-014c-4d56-a7c7-a5ffb8a8e609" (UID: "782abdb8-014c-4d56-a7c7-a5ffb8a8e609"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.253289 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-769bf\" (UniqueName: \"kubernetes.io/projected/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-kube-api-access-769bf\") on node \"crc\" DevicePath \"\"" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.253367 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782abdb8-014c-4d56-a7c7-a5ffb8a8e609-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.541716 4836 generic.go:334] "Generic (PLEG): container finished" podID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerID="4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52" exitCode=0 Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.541784 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96zg" event={"ID":"782abdb8-014c-4d56-a7c7-a5ffb8a8e609","Type":"ContainerDied","Data":"4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52"} Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.541850 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s96zg" event={"ID":"782abdb8-014c-4d56-a7c7-a5ffb8a8e609","Type":"ContainerDied","Data":"52d2f884fbcb1f46af948eaf3e822be773031f979f684862e6d76af9c0054264"} Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.541873 4836 scope.go:117] "RemoveContainer" containerID="4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.541926 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s96zg" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.566520 4836 scope.go:117] "RemoveContainer" containerID="6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.591192 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s96zg"] Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.593426 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s96zg"] Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.602256 4836 scope.go:117] "RemoveContainer" containerID="383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.650834 4836 scope.go:117] "RemoveContainer" containerID="4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52" Feb 17 14:42:32 crc kubenswrapper[4836]: E0217 14:42:32.651502 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52\": container with ID starting with 4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52 not found: ID does not exist" containerID="4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.651563 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52"} err="failed to get container status \"4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52\": rpc error: code = NotFound desc = could not find container \"4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52\": container with ID starting with 4a1b17ed59affd7deaa2032e53a2dda11bfb17725f3a69ef2d57bc516fd6ce52 not 
found: ID does not exist" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.651596 4836 scope.go:117] "RemoveContainer" containerID="6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87" Feb 17 14:42:32 crc kubenswrapper[4836]: E0217 14:42:32.656283 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87\": container with ID starting with 6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87 not found: ID does not exist" containerID="6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.656349 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87"} err="failed to get container status \"6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87\": rpc error: code = NotFound desc = could not find container \"6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87\": container with ID starting with 6bf27dcf1f51f29080696e75835d07c03e3aa2670a5cd4ab04af4eea98086d87 not found: ID does not exist" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.656384 4836 scope.go:117] "RemoveContainer" containerID="383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76" Feb 17 14:42:32 crc kubenswrapper[4836]: E0217 14:42:32.656951 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76\": container with ID starting with 383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76 not found: ID does not exist" containerID="383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76" Feb 17 14:42:32 crc kubenswrapper[4836]: I0217 14:42:32.656982 4836 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76"} err="failed to get container status \"383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76\": rpc error: code = NotFound desc = could not find container \"383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76\": container with ID starting with 383929ae316c24d0ac7b145939152ce6f201b6a654e5356dc808be6253f6bf76 not found: ID does not exist" Feb 17 14:42:34 crc kubenswrapper[4836]: I0217 14:42:34.584043 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" path="/var/lib/kubelet/pods/782abdb8-014c-4d56-a7c7-a5ffb8a8e609/volumes" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.720337 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8n5s"] Feb 17 14:42:58 crc kubenswrapper[4836]: E0217 14:42:58.721278 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="extract-content" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.721319 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="extract-content" Feb 17 14:42:58 crc kubenswrapper[4836]: E0217 14:42:58.721339 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="extract-utilities" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.721346 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="extract-utilities" Feb 17 14:42:58 crc kubenswrapper[4836]: E0217 14:42:58.721359 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="registry-server" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 
14:42:58.721365 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="registry-server" Feb 17 14:42:58 crc kubenswrapper[4836]: E0217 14:42:58.721377 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="extract-utilities" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.721383 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="extract-utilities" Feb 17 14:42:58 crc kubenswrapper[4836]: E0217 14:42:58.721393 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="extract-content" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.721399 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="extract-content" Feb 17 14:42:58 crc kubenswrapper[4836]: E0217 14:42:58.721420 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="registry-server" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.721425 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="registry-server" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.721652 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcd4960-7859-4e31-829d-e737ae014f31" containerName="registry-server" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.721665 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="782abdb8-014c-4d56-a7c7-a5ffb8a8e609" containerName="registry-server" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.723308 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.749512 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8n5s"] Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.901161 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2w8s\" (UniqueName: \"kubernetes.io/projected/1362b4e0-d576-4cc3-b60d-22dc164d36e6-kube-api-access-g2w8s\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.901420 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-utilities\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:58 crc kubenswrapper[4836]: I0217 14:42:58.902037 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-catalog-content\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.009635 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-utilities\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.009828 4836 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-catalog-content\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.009878 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2w8s\" (UniqueName: \"kubernetes.io/projected/1362b4e0-d576-4cc3-b60d-22dc164d36e6-kube-api-access-g2w8s\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.011404 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-utilities\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.011666 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-catalog-content\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.057890 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2w8s\" (UniqueName: \"kubernetes.io/projected/1362b4e0-d576-4cc3-b60d-22dc164d36e6-kube-api-access-g2w8s\") pod \"redhat-marketplace-l8n5s\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.199759 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.696215 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8n5s"] Feb 17 14:42:59 crc kubenswrapper[4836]: I0217 14:42:59.918733 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8n5s" event={"ID":"1362b4e0-d576-4cc3-b60d-22dc164d36e6","Type":"ContainerStarted","Data":"5e39044fdc945b0ccefd2f1703920c43f3a5e055cf64150d9a50d4dcc1955e22"} Feb 17 14:43:00 crc kubenswrapper[4836]: I0217 14:43:00.931506 4836 generic.go:334] "Generic (PLEG): container finished" podID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerID="c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475" exitCode=0 Feb 17 14:43:00 crc kubenswrapper[4836]: I0217 14:43:00.931612 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8n5s" event={"ID":"1362b4e0-d576-4cc3-b60d-22dc164d36e6","Type":"ContainerDied","Data":"c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475"} Feb 17 14:43:02 crc kubenswrapper[4836]: I0217 14:43:02.983523 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8n5s" event={"ID":"1362b4e0-d576-4cc3-b60d-22dc164d36e6","Type":"ContainerStarted","Data":"446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f"} Feb 17 14:43:03 crc kubenswrapper[4836]: I0217 14:43:03.997362 4836 generic.go:334] "Generic (PLEG): container finished" podID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerID="446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f" exitCode=0 Feb 17 14:43:03 crc kubenswrapper[4836]: I0217 14:43:03.997485 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8n5s" 
event={"ID":"1362b4e0-d576-4cc3-b60d-22dc164d36e6","Type":"ContainerDied","Data":"446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f"} Feb 17 14:43:05 crc kubenswrapper[4836]: I0217 14:43:05.012086 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8n5s" event={"ID":"1362b4e0-d576-4cc3-b60d-22dc164d36e6","Type":"ContainerStarted","Data":"c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f"} Feb 17 14:43:05 crc kubenswrapper[4836]: I0217 14:43:05.031515 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8n5s" podStartSLOduration=3.5235011419999998 podStartE2EDuration="7.031466629s" podCreationTimestamp="2026-02-17 14:42:58 +0000 UTC" firstStartedPulling="2026-02-17 14:43:00.93427636 +0000 UTC m=+2207.277204629" lastFinishedPulling="2026-02-17 14:43:04.442241837 +0000 UTC m=+2210.785170116" observedRunningTime="2026-02-17 14:43:05.030155874 +0000 UTC m=+2211.373084163" watchObservedRunningTime="2026-02-17 14:43:05.031466629 +0000 UTC m=+2211.374394898" Feb 17 14:43:09 crc kubenswrapper[4836]: I0217 14:43:09.200845 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:43:09 crc kubenswrapper[4836]: I0217 14:43:09.201564 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:43:09 crc kubenswrapper[4836]: I0217 14:43:09.253827 4836 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:43:10 crc kubenswrapper[4836]: I0217 14:43:10.108402 4836 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:43:10 crc kubenswrapper[4836]: I0217 14:43:10.160770 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-l8n5s"] Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.078473 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8n5s" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="registry-server" containerID="cri-o://c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f" gracePeriod=2 Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.762204 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.830564 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-utilities\") pod \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.830761 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-catalog-content\") pod \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.831153 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2w8s\" (UniqueName: \"kubernetes.io/projected/1362b4e0-d576-4cc3-b60d-22dc164d36e6-kube-api-access-g2w8s\") pod \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\" (UID: \"1362b4e0-d576-4cc3-b60d-22dc164d36e6\") " Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.832008 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-utilities" (OuterVolumeSpecName: "utilities") pod "1362b4e0-d576-4cc3-b60d-22dc164d36e6" (UID: 
"1362b4e0-d576-4cc3-b60d-22dc164d36e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.832288 4836 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.837686 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1362b4e0-d576-4cc3-b60d-22dc164d36e6-kube-api-access-g2w8s" (OuterVolumeSpecName: "kube-api-access-g2w8s") pod "1362b4e0-d576-4cc3-b60d-22dc164d36e6" (UID: "1362b4e0-d576-4cc3-b60d-22dc164d36e6"). InnerVolumeSpecName "kube-api-access-g2w8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.933460 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2w8s\" (UniqueName: \"kubernetes.io/projected/1362b4e0-d576-4cc3-b60d-22dc164d36e6-kube-api-access-g2w8s\") on node \"crc\" DevicePath \"\"" Feb 17 14:43:12 crc kubenswrapper[4836]: I0217 14:43:12.983731 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1362b4e0-d576-4cc3-b60d-22dc164d36e6" (UID: "1362b4e0-d576-4cc3-b60d-22dc164d36e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.034978 4836 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1362b4e0-d576-4cc3-b60d-22dc164d36e6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.243969 4836 generic.go:334] "Generic (PLEG): container finished" podID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerID="c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f" exitCode=0 Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.244026 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8n5s" event={"ID":"1362b4e0-d576-4cc3-b60d-22dc164d36e6","Type":"ContainerDied","Data":"c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f"} Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.244061 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8n5s" event={"ID":"1362b4e0-d576-4cc3-b60d-22dc164d36e6","Type":"ContainerDied","Data":"5e39044fdc945b0ccefd2f1703920c43f3a5e055cf64150d9a50d4dcc1955e22"} Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.244079 4836 scope.go:117] "RemoveContainer" containerID="c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.245009 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8n5s" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.263399 4836 scope.go:117] "RemoveContainer" containerID="446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.283678 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8n5s"] Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.293157 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8n5s"] Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.305358 4836 scope.go:117] "RemoveContainer" containerID="c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.339125 4836 scope.go:117] "RemoveContainer" containerID="c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f" Feb 17 14:43:13 crc kubenswrapper[4836]: E0217 14:43:13.339928 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f\": container with ID starting with c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f not found: ID does not exist" containerID="c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.340006 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f"} err="failed to get container status \"c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f\": rpc error: code = NotFound desc = could not find container \"c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f\": container with ID starting with c3fbd162f9e80809d3ba31e6ab607c5f8d8bee1d68434fdee175237d6d20554f not found: 
ID does not exist" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.340067 4836 scope.go:117] "RemoveContainer" containerID="446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f" Feb 17 14:43:13 crc kubenswrapper[4836]: E0217 14:43:13.340770 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f\": container with ID starting with 446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f not found: ID does not exist" containerID="446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.340815 4836 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f"} err="failed to get container status \"446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f\": rpc error: code = NotFound desc = could not find container \"446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f\": container with ID starting with 446d8b49f1c41afa000152dee4447afdd73a376b18b18225416d88792c05eb9f not found: ID does not exist" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.340843 4836 scope.go:117] "RemoveContainer" containerID="c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475" Feb 17 14:43:13 crc kubenswrapper[4836]: E0217 14:43:13.343720 4836 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475\": container with ID starting with c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475 not found: ID does not exist" containerID="c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475" Feb 17 14:43:13 crc kubenswrapper[4836]: I0217 14:43:13.343782 4836 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475"} err="failed to get container status \"c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475\": rpc error: code = NotFound desc = could not find container \"c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475\": container with ID starting with c6cfe2a894550e4dfa97d33e7a8c45bba54a5b3f31c99c78437667f405675475 not found: ID does not exist" Feb 17 14:43:14 crc kubenswrapper[4836]: I0217 14:43:14.585468 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" path="/var/lib/kubelet/pods/1362b4e0-d576-4cc3-b60d-22dc164d36e6/volumes" Feb 17 14:44:59 crc kubenswrapper[4836]: I0217 14:44:59.765314 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:44:59 crc kubenswrapper[4836]: I0217 14:44:59.766091 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.360011 4836 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm"] Feb 17 14:45:00 crc kubenswrapper[4836]: E0217 14:45:00.360774 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="extract-utilities" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.360789 4836 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="extract-utilities" Feb 17 14:45:00 crc kubenswrapper[4836]: E0217 14:45:00.360804 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="extract-content" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.360810 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="extract-content" Feb 17 14:45:00 crc kubenswrapper[4836]: E0217 14:45:00.360821 4836 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="registry-server" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.360827 4836 state_mem.go:107] "Deleted CPUSet assignment" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="registry-server" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.361036 4836 memory_manager.go:354] "RemoveStaleState removing state" podUID="1362b4e0-d576-4cc3-b60d-22dc164d36e6" containerName="registry-server" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.362174 4836 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.367072 4836 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.367582 4836 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.381323 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-config-volume\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.381449 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grkhq\" (UniqueName: \"kubernetes.io/projected/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-kube-api-access-grkhq\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.381564 4836 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-secret-volume\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.384366 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm"] Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.484100 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-config-volume\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.484245 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grkhq\" (UniqueName: \"kubernetes.io/projected/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-kube-api-access-grkhq\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.484404 4836 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-secret-volume\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.485537 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-config-volume\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.496046 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-secret-volume\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.511991 4836 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grkhq\" (UniqueName: \"kubernetes.io/projected/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-kube-api-access-grkhq\") pod \"collect-profiles-29522325-6z9vm\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:00 crc kubenswrapper[4836]: I0217 14:45:00.711931 4836 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:01 crc kubenswrapper[4836]: I0217 14:45:01.434931 4836 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm"] Feb 17 14:45:01 crc kubenswrapper[4836]: I0217 14:45:01.802089 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" event={"ID":"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765","Type":"ContainerStarted","Data":"7ca139c0f79ad8f1696fe7f9e1c84b53c156ee1274043c6c3d37f1283049d2db"} Feb 17 14:45:01 crc kubenswrapper[4836]: I0217 14:45:01.802152 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" event={"ID":"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765","Type":"ContainerStarted","Data":"c3be4cefaa64aaba8ee9b719c8fa1372623b7c9d9319f4164769a8ca22750bd7"} Feb 17 14:45:01 crc kubenswrapper[4836]: I0217 14:45:01.835100 4836 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" 
podStartSLOduration=1.835061963 podStartE2EDuration="1.835061963s" podCreationTimestamp="2026-02-17 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 14:45:01.821637452 +0000 UTC m=+2328.164565731" watchObservedRunningTime="2026-02-17 14:45:01.835061963 +0000 UTC m=+2328.177990242" Feb 17 14:45:02 crc kubenswrapper[4836]: I0217 14:45:02.813732 4836 generic.go:334] "Generic (PLEG): container finished" podID="9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765" containerID="7ca139c0f79ad8f1696fe7f9e1c84b53c156ee1274043c6c3d37f1283049d2db" exitCode=0 Feb 17 14:45:02 crc kubenswrapper[4836]: I0217 14:45:02.813794 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" event={"ID":"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765","Type":"ContainerDied","Data":"7ca139c0f79ad8f1696fe7f9e1c84b53c156ee1274043c6c3d37f1283049d2db"} Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.341666 4836 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.483494 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-config-volume\") pod \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.483616 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-secret-volume\") pod \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.483669 4836 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grkhq\" (UniqueName: \"kubernetes.io/projected/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-kube-api-access-grkhq\") pod \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\" (UID: \"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765\") " Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.485883 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-config-volume" (OuterVolumeSpecName: "config-volume") pod "9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765" (UID: "9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.500327 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-kube-api-access-grkhq" (OuterVolumeSpecName: "kube-api-access-grkhq") pod "9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765" (UID: "9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765"). 
InnerVolumeSpecName "kube-api-access-grkhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.503523 4836 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765" (UID: "9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.542146 4836 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"] Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.561486 4836 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522280-c8fps"] Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.586752 4836 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.586809 4836 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.586821 4836 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grkhq\" (UniqueName: \"kubernetes.io/projected/9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765-kube-api-access-grkhq\") on node \"crc\" DevicePath \"\"" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.592538 4836 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91eb437c-beea-4f2d-b3f7-505b87fe6dee" path="/var/lib/kubelet/pods/91eb437c-beea-4f2d-b3f7-505b87fe6dee/volumes" Feb 17 14:45:04 
crc kubenswrapper[4836]: I0217 14:45:04.838289 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" event={"ID":"9e1a5bfb-4c7e-47f1-9acb-1c7da14e7765","Type":"ContainerDied","Data":"c3be4cefaa64aaba8ee9b719c8fa1372623b7c9d9319f4164769a8ca22750bd7"} Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.838427 4836 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522325-6z9vm" Feb 17 14:45:04 crc kubenswrapper[4836]: I0217 14:45:04.838354 4836 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3be4cefaa64aaba8ee9b719c8fa1372623b7c9d9319f4164769a8ca22750bd7" Feb 17 14:45:26 crc kubenswrapper[4836]: I0217 14:45:26.990954 4836 scope.go:117] "RemoveContainer" containerID="4b8580f44aade0425b4de34e0f49d07bd6192e526f9c10aa11b53556a3546660" Feb 17 14:45:29 crc kubenswrapper[4836]: I0217 14:45:29.765749 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 14:45:29 crc kubenswrapper[4836]: I0217 14:45:29.766532 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:45:59 crc kubenswrapper[4836]: I0217 14:45:59.765530 4836 patch_prober.go:28] interesting pod/machine-config-daemon-bkk9g container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 17 14:45:59 crc kubenswrapper[4836]: I0217 14:45:59.766183 4836 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 14:45:59 crc kubenswrapper[4836]: I0217 14:45:59.766257 4836 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" Feb 17 14:45:59 crc kubenswrapper[4836]: I0217 14:45:59.767426 4836 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cd1f125f5a9b4dbb6838481ae07ba424a29fd62723f85d44334ea5eb698c97d"} pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 14:45:59 crc kubenswrapper[4836]: I0217 14:45:59.767505 4836 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" containerName="machine-config-daemon" containerID="cri-o://0cd1f125f5a9b4dbb6838481ae07ba424a29fd62723f85d44334ea5eb698c97d" gracePeriod=600 Feb 17 14:45:59 crc kubenswrapper[4836]: E0217 14:45:59.900133 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c" Feb 17 14:46:00 crc 
kubenswrapper[4836]: I0217 14:46:00.474808 4836 generic.go:334] "Generic (PLEG): container finished" podID="895a19c9-a3f0-4a15-aa19-19347121388c" containerID="0cd1f125f5a9b4dbb6838481ae07ba424a29fd62723f85d44334ea5eb698c97d" exitCode=0 Feb 17 14:46:00 crc kubenswrapper[4836]: I0217 14:46:00.474870 4836 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" event={"ID":"895a19c9-a3f0-4a15-aa19-19347121388c","Type":"ContainerDied","Data":"0cd1f125f5a9b4dbb6838481ae07ba424a29fd62723f85d44334ea5eb698c97d"} Feb 17 14:46:00 crc kubenswrapper[4836]: I0217 14:46:00.474913 4836 scope.go:117] "RemoveContainer" containerID="6f19bb9a4d6443b07f247471c35e97a577b83e39d81d033aff596fac57089969" Feb 17 14:46:00 crc kubenswrapper[4836]: I0217 14:46:00.475793 4836 scope.go:117] "RemoveContainer" containerID="0cd1f125f5a9b4dbb6838481ae07ba424a29fd62723f85d44334ea5eb698c97d" Feb 17 14:46:00 crc kubenswrapper[4836]: E0217 14:46:00.476264 4836 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bkk9g_openshift-machine-config-operator(895a19c9-a3f0-4a15-aa19-19347121388c)\"" pod="openshift-machine-config-operator/machine-config-daemon-bkk9g" podUID="895a19c9-a3f0-4a15-aa19-19347121388c"